00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v22.11" build number 2025 00:00:00.001 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3290 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.014 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.015 The recommended git tool is: git 00:00:00.015 using credential 00000000-0000-0000-0000-000000000002 00:00:00.016 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.030 Fetching changes from the remote Git repository 00:00:00.031 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.047 Using shallow fetch with depth 1 00:00:00.047 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.047 > git --version # timeout=10 00:00:00.067 > git --version # 'git version 2.39.2' 00:00:00.067 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.085 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.085 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.181 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.193 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.205 Checking out Revision 456d80899d5187c68de113852b37bde1201fd33a (FETCH_HEAD) 00:00:02.205 > git config core.sparsecheckout # timeout=10 00:00:02.215 > git read-tree -mu HEAD # timeout=10 00:00:02.229 > git checkout -f 456d80899d5187c68de113852b37bde1201fd33a # timeout=5 
00:00:02.260 Commit message: "jenkins/config: Drop WFP25 for maintenance" 00:00:02.261 > git rev-list --no-walk 456d80899d5187c68de113852b37bde1201fd33a # timeout=10 00:00:02.464 [Pipeline] Start of Pipeline 00:00:02.477 [Pipeline] library 00:00:02.479 Loading library shm_lib@master 00:00:02.479 Library shm_lib@master is cached. Copying from home. 00:00:02.493 [Pipeline] node 00:00:02.509 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest 00:00:02.510 [Pipeline] { 00:00:02.518 [Pipeline] catchError 00:00:02.519 [Pipeline] { 00:00:02.529 [Pipeline] wrap 00:00:02.538 [Pipeline] { 00:00:02.547 [Pipeline] stage 00:00:02.548 [Pipeline] { (Prologue) 00:00:02.734 [Pipeline] sh 00:00:03.012 + logger -p user.info -t JENKINS-CI 00:00:03.025 [Pipeline] echo 00:00:03.026 Node: WFP50 00:00:03.032 [Pipeline] sh 00:00:03.322 [Pipeline] setCustomBuildProperty 00:00:03.331 [Pipeline] echo 00:00:03.332 Cleanup processes 00:00:03.336 [Pipeline] sh 00:00:03.615 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.615 3893676 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.628 [Pipeline] sh 00:00:03.905 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.905 ++ grep -v 'sudo pgrep' 00:00:03.905 ++ awk '{print $1}' 00:00:03.905 + sudo kill -9 00:00:03.905 + true 00:00:03.919 [Pipeline] cleanWs 00:00:03.928 [WS-CLEANUP] Deleting project workspace... 00:00:03.928 [WS-CLEANUP] Deferred wipeout is used... 
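The cleanup stage traced above pipes `sudo pgrep -af` through `grep -v 'sudo pgrep'` and `awk '{print $1}'` so the probe itself is never handed to `kill -9`, and the trailing `+ true` keeps an empty match list from failing the build. A standalone sketch of that filter on canned input (the `filter_pids` name is mine, not SPDK's):

```shell
# Reduce a `pgrep -af` listing to PIDs, dropping the pgrep probe itself
# so a later `kill -9` never targets it.
filter_pids() {
    grep -v 'sudo pgrep' | awk '{print $1}'
}

# Canned listing shaped like the log's: the probe line plus one real hit.
listing='3893676 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
4001 /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt'

printf '%s\n' "$listing" | filter_pids
# Only 4001 survives; in the pipeline this feeds `sudo kill -9`, with
# `|| true` (the log's `+ true`) absorbing the empty-list case.
```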
00:00:03.934 [WS-CLEANUP] done 00:00:03.938 [Pipeline] setCustomBuildProperty 00:00:03.949 [Pipeline] sh 00:00:04.235 + sudo git config --global --replace-all safe.directory '*' 00:00:04.304 [Pipeline] httpRequest 00:00:04.324 [Pipeline] echo 00:00:04.326 Sorcerer 10.211.164.101 is alive 00:00:04.333 [Pipeline] httpRequest 00:00:04.337 HttpMethod: GET 00:00:04.338 URL: http://10.211.164.101/packages/jbp_456d80899d5187c68de113852b37bde1201fd33a.tar.gz 00:00:04.338 Sending request to url: http://10.211.164.101/packages/jbp_456d80899d5187c68de113852b37bde1201fd33a.tar.gz 00:00:04.340 Response Code: HTTP/1.1 200 OK 00:00:04.341 Success: Status code 200 is in the accepted range: 200,404 00:00:04.341 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_456d80899d5187c68de113852b37bde1201fd33a.tar.gz 00:00:05.021 [Pipeline] sh 00:00:05.303 + tar --no-same-owner -xf jbp_456d80899d5187c68de113852b37bde1201fd33a.tar.gz 00:00:05.316 [Pipeline] httpRequest 00:00:05.342 [Pipeline] echo 00:00:05.344 Sorcerer 10.211.164.101 is alive 00:00:05.352 [Pipeline] httpRequest 00:00:05.356 HttpMethod: GET 00:00:05.356 URL: http://10.211.164.101/packages/spdk_b8378f94e02ef4dd21e7023626f6c3b47a36f5c1.tar.gz 00:00:05.357 Sending request to url: http://10.211.164.101/packages/spdk_b8378f94e02ef4dd21e7023626f6c3b47a36f5c1.tar.gz 00:00:05.370 Response Code: HTTP/1.1 200 OK 00:00:05.371 Success: Status code 200 is in the accepted range: 200,404 00:00:05.371 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_b8378f94e02ef4dd21e7023626f6c3b47a36f5c1.tar.gz 00:00:33.436 [Pipeline] sh 00:00:33.722 + tar --no-same-owner -xf spdk_b8378f94e02ef4dd21e7023626f6c3b47a36f5c1.tar.gz 00:00:37.927 [Pipeline] sh 00:00:38.210 + git -C spdk log --oneline -n5 00:00:38.210 b8378f94e scripts/pkgdep: Set yum's skip_if_unavailable=True under rocky8 00:00:38.210 c2a77f51e module/bdev/nvme: add detach-monitor poller 00:00:38.210 e14876e17 lib/nvme: add spdk_nvme_scan_attached() 
00:00:38.210 1d6dfcbeb nvme_pci: ctrlr_scan_attached callback 00:00:38.210 ff6594986 nvme_transport: optional callback to scan attached 00:00:38.225 [Pipeline] withCredentials 00:00:38.233 > git --version # timeout=10 00:00:38.245 > git --version # 'git version 2.39.2' 00:00:38.258 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:00:38.260 [Pipeline] { 00:00:38.268 [Pipeline] retry 00:00:38.269 [Pipeline] { 00:00:38.281 [Pipeline] sh 00:00:38.558 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:00:38.830 [Pipeline] } 00:00:38.851 [Pipeline] // retry 00:00:38.863 [Pipeline] } 00:00:38.884 [Pipeline] // withCredentials 00:00:38.895 [Pipeline] httpRequest 00:00:38.914 [Pipeline] echo 00:00:38.916 Sorcerer 10.211.164.101 is alive 00:00:38.926 [Pipeline] httpRequest 00:00:38.931 HttpMethod: GET 00:00:38.931 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:00:38.932 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:00:38.950 Response Code: HTTP/1.1 200 OK 00:00:38.951 Success: Status code 200 is in the accepted range: 200,404 00:00:38.951 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:00:54.979 [Pipeline] sh 00:00:55.263 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:00:57.810 [Pipeline] sh 00:00:58.125 + git -C dpdk log --oneline -n5 00:00:58.125 caf0f5d395 version: 22.11.4 00:00:58.125 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:00:58.125 dc9c799c7d vhost: fix missing spinlock unlock 00:00:58.125 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:00:58.125 6ef77f2a5e net/gve: fix RX buffer size alignment 00:00:58.135 [Pipeline] } 00:00:58.152 [Pipeline] // stage 00:00:58.161 [Pipeline] stage 00:00:58.163 [Pipeline] { (Prepare) 00:00:58.182 [Pipeline] writeFile 00:00:58.198 [Pipeline] sh 
00:00:58.479 + logger -p user.info -t JENKINS-CI 00:00:58.493 [Pipeline] sh 00:00:58.776 + logger -p user.info -t JENKINS-CI 00:00:58.788 [Pipeline] sh 00:00:59.072 + cat autorun-spdk.conf 00:00:59.072 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:59.072 SPDK_TEST_BLOCKDEV=1 00:00:59.072 SPDK_TEST_ISAL=1 00:00:59.072 SPDK_TEST_CRYPTO=1 00:00:59.072 SPDK_TEST_REDUCE=1 00:00:59.072 SPDK_TEST_VBDEV_COMPRESS=1 00:00:59.072 SPDK_RUN_UBSAN=1 00:00:59.072 SPDK_TEST_ACCEL=1 00:00:59.072 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:00:59.072 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:00:59.079 RUN_NIGHTLY=1 00:00:59.084 [Pipeline] readFile 00:00:59.111 [Pipeline] withEnv 00:00:59.113 [Pipeline] { 00:00:59.129 [Pipeline] sh 00:00:59.413 + set -ex 00:00:59.413 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:59.413 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:59.413 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:59.413 ++ SPDK_TEST_BLOCKDEV=1 00:00:59.413 ++ SPDK_TEST_ISAL=1 00:00:59.413 ++ SPDK_TEST_CRYPTO=1 00:00:59.413 ++ SPDK_TEST_REDUCE=1 00:00:59.413 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:59.413 ++ SPDK_RUN_UBSAN=1 00:00:59.413 ++ SPDK_TEST_ACCEL=1 00:00:59.413 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:00:59.413 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:00:59.413 ++ RUN_NIGHTLY=1 00:00:59.413 + case $SPDK_TEST_NVMF_NICS in 00:00:59.413 + DRIVERS= 00:00:59.413 + [[ -n '' ]] 00:00:59.413 + exit 0 00:00:59.421 [Pipeline] } 00:00:59.437 [Pipeline] // withEnv 00:00:59.443 [Pipeline] } 00:00:59.459 [Pipeline] // stage 00:00:59.468 [Pipeline] catchError 00:00:59.469 [Pipeline] { 00:00:59.483 [Pipeline] timeout 00:00:59.483 Timeout set to expire in 1 hr 0 min 00:00:59.485 [Pipeline] { 00:00:59.500 [Pipeline] stage 00:00:59.502 [Pipeline] { (Tests) 00:00:59.517 [Pipeline] sh 00:00:59.801 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 
00:00:59.801 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:59.801 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:59.801 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:00:59.801 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:59.801 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:59.801 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:59.801 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:59.801 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:59.801 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:59.801 + [[ crypto-phy-autotest == pkgdep-* ]] 00:00:59.801 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:59.801 + source /etc/os-release 00:00:59.801 ++ NAME='Fedora Linux' 00:00:59.801 ++ VERSION='38 (Cloud Edition)' 00:00:59.801 ++ ID=fedora 00:00:59.801 ++ VERSION_ID=38 00:00:59.801 ++ VERSION_CODENAME= 00:00:59.801 ++ PLATFORM_ID=platform:f38 00:00:59.801 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:59.801 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:59.801 ++ LOGO=fedora-logo-icon 00:00:59.802 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:59.802 ++ HOME_URL=https://fedoraproject.org/ 00:00:59.802 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:59.802 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:59.802 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:59.802 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:59.802 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:59.802 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:59.802 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:59.802 ++ SUPPORT_END=2024-05-14 00:00:59.802 ++ VARIANT='Cloud Edition' 00:00:59.802 ++ VARIANT_ID=cloud 00:00:59.802 + uname -a 00:00:59.802 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:59.802 + sudo 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:03.095 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:01:03.095 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:01:03.095 Hugepages 00:01:03.095 node hugesize free / total 00:01:03.095 node0 1048576kB 0 / 0 00:01:03.095 node0 2048kB 0 / 0 00:01:03.095 node1 1048576kB 0 / 0 00:01:03.095 node1 2048kB 0 / 0 00:01:03.095 00:01:03.095 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:03.095 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:03.095 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:03.095 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:01:03.095 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:03.095 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:03.095 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:01:03.095 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:01:03.095 + rm -f /tmp/spdk-ld-path 00:01:03.095 + source autorun-spdk.conf 00:01:03.095 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:03.095 ++ SPDK_TEST_BLOCKDEV=1 00:01:03.095 ++ SPDK_TEST_ISAL=1 00:01:03.095 ++ SPDK_TEST_CRYPTO=1 00:01:03.095 ++ SPDK_TEST_REDUCE=1 00:01:03.095 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:03.095 ++ SPDK_RUN_UBSAN=1 00:01:03.095 ++ SPDK_TEST_ACCEL=1 00:01:03.095 ++ 
SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:03.095 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:03.095 ++ RUN_NIGHTLY=1 00:01:03.095 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:03.095 + [[ -n '' ]] 00:01:03.095 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:03.355 + for M in /var/spdk/build-*-manifest.txt 00:01:03.355 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:03.355 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:03.355 + for M in /var/spdk/build-*-manifest.txt 00:01:03.355 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:03.355 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:03.355 ++ uname 00:01:03.355 + [[ Linux == \L\i\n\u\x ]] 00:01:03.355 + sudo dmesg -T 00:01:03.355 + sudo dmesg --clear 00:01:03.355 + dmesg_pid=3895186 00:01:03.355 + [[ Fedora Linux == FreeBSD ]] 00:01:03.355 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:03.355 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:03.355 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:03.355 + sudo dmesg -Tw 00:01:03.355 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:03.355 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:03.355 + [[ -x /usr/src/fio-static/fio ]] 00:01:03.355 + export FIO_BIN=/usr/src/fio-static/fio 00:01:03.355 + FIO_BIN=/usr/src/fio-static/fio 00:01:03.355 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:03.355 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:03.355 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:03.355 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:03.355 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:03.355 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:03.355 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:03.355 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:03.355 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:03.355 Test configuration: 00:01:03.355 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:03.355 SPDK_TEST_BLOCKDEV=1 00:01:03.355 SPDK_TEST_ISAL=1 00:01:03.355 SPDK_TEST_CRYPTO=1 00:01:03.355 SPDK_TEST_REDUCE=1 00:01:03.355 SPDK_TEST_VBDEV_COMPRESS=1 00:01:03.355 SPDK_RUN_UBSAN=1 00:01:03.355 SPDK_TEST_ACCEL=1 00:01:03.355 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:03.355 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:03.355 RUN_NIGHTLY=1 16:54:58 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:03.355 16:54:58 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:03.355 16:54:58 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:03.355 16:54:58 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:03.355 16:54:58 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.355 16:54:58 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.355 16:54:58 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.355 16:54:58 -- paths/export.sh@5 -- $ export PATH 00:01:03.355 16:54:58 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.355 16:54:58 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:03.355 16:54:58 -- common/autobuild_common.sh@447 -- $ date +%s 00:01:03.355 16:54:58 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721746498.XXXXXX 00:01:03.355 16:54:58 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721746498.RNJAWb 00:01:03.355 16:54:58 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:01:03.355 16:54:58 -- common/autobuild_common.sh@453 -- $ '[' -n v22.11.4 ']' 00:01:03.355 16:54:58 -- common/autobuild_common.sh@454 -- $ dirname /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:03.355 16:54:58 -- 
common/autobuild_common.sh@454 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk' 00:01:03.355 16:54:58 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:03.355 16:54:58 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:03.355 16:54:58 -- common/autobuild_common.sh@463 -- $ get_config_params 00:01:03.355 16:54:58 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:03.355 16:54:58 -- common/autotest_common.sh@10 -- $ set +x 00:01:03.615 16:54:58 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build' 00:01:03.615 16:54:58 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:01:03.615 16:54:58 -- pm/common@17 -- $ local monitor 00:01:03.615 16:54:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.615 16:54:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.615 16:54:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.615 16:54:58 -- pm/common@21 -- $ date +%s 00:01:03.615 16:54:58 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.615 16:54:58 -- pm/common@21 -- $ date +%s 00:01:03.615 16:54:58 -- pm/common@25 -- $ sleep 1 00:01:03.615 16:54:58 -- pm/common@21 -- $ date +%s 00:01:03.615 16:54:58 -- pm/common@21 -- $ date +%s 00:01:03.615 16:54:58 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721746498 00:01:03.615 16:54:58 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721746498 00:01:03.615 16:54:58 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721746498 00:01:03.615 16:54:58 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721746498 00:01:03.615 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721746498_collect-vmstat.pm.log 00:01:03.615 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721746498_collect-cpu-load.pm.log 00:01:03.615 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721746498_collect-cpu-temp.pm.log 00:01:03.615 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721746498_collect-bmc-pm.bmc.pm.log 00:01:04.554 16:54:59 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:01:04.554 16:54:59 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:04.554 16:54:59 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:04.554 16:54:59 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:04.554 16:54:59 -- spdk/autobuild.sh@16 -- $ date -u 00:01:04.554 Tue Jul 23 02:54:59 PM UTC 2024 00:01:04.554 16:54:59 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:04.554 
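The pm/common lines above launch several collectors in the background, all tagged with one `date +%s` stamp, and the log later registers `trap stop_monitor_resources EXIT` to tear them down. A minimal reproduction of that start/stop pattern, with `fake_collector` standing in for the real collect-cpu-load / collect-vmstat scripts:

```shell
# Start background resource monitors under one epoch stamp and stop them
# from an EXIT trap. `fake_collector` is a placeholder loop, not SPDK code.
fake_collector() { while :; do sleep 1; done; }

stamp=$(date +%s)
pids=""
for name in cpu-load vmstat cpu-temp; do
    fake_collector &
    pids="$pids $!"
    echo "monitor.autobuild.sh.$stamp ($name) -> pid $!"
done

# Mirrors the log's `trap stop_monitor_resources EXIT`: collectors die
# with the build shell, whether it succeeds or aborts.
stop_monitor_resources() { kill $pids 2>/dev/null; }
trap stop_monitor_resources EXIT
```

The trap-on-EXIT shape is what lets the pipeline's `Redirecting to ...pm.log` collectors run for the whole build without an explicit stop call on every exit path.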
v24.09-pre-302-gb8378f94e 00:01:04.554 16:54:59 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:04.554 16:54:59 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:04.554 16:54:59 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:04.554 16:54:59 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:04.554 16:54:59 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:04.554 16:54:59 -- common/autotest_common.sh@10 -- $ set +x 00:01:04.554 ************************************ 00:01:04.554 START TEST ubsan 00:01:04.554 ************************************ 00:01:04.554 16:54:59 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:04.554 using ubsan 00:01:04.554 00:01:04.554 real 0m0.001s 00:01:04.554 user 0m0.001s 00:01:04.554 sys 0m0.000s 00:01:04.554 16:54:59 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:04.554 16:54:59 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:04.554 ************************************ 00:01:04.554 END TEST ubsan 00:01:04.554 ************************************ 00:01:04.554 16:54:59 -- common/autotest_common.sh@1142 -- $ return 0 00:01:04.554 16:54:59 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:04.554 16:54:59 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:04.554 16:54:59 -- common/autobuild_common.sh@439 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:04.554 16:54:59 -- common/autotest_common.sh@1099 -- $ '[' 2 -le 1 ']' 00:01:04.554 16:54:59 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:04.554 16:54:59 -- common/autotest_common.sh@10 -- $ set +x 00:01:04.554 ************************************ 00:01:04.554 START TEST build_native_dpdk 00:01:04.554 ************************************ 00:01:04.554 16:54:59 build_native_dpdk -- common/autotest_common.sh@1123 -- $ _build_native_dpdk 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:04.554 16:54:59 build_native_dpdk 
-- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/crypto-phy-autotest/dpdk 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/dpdk ]] 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:04.554 16:54:59 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/crypto-phy-autotest/dpdk log --oneline -n 5 00:01:04.554 caf0f5d395 version: 22.11.4 00:01:04.554 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:04.554 dc9c799c7d vhost: fix missing spinlock unlock 00:01:04.554 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:04.554 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 1 -eq 1 ]] 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@104 -- $ 
intel_ipsec_mb_ver=v0.54 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@105 -- $ intel_ipsec_mb_drv=crypto/aesni_mb 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@106 -- $ intel_ipsec_lib= 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@107 -- $ ge 22.11.4 21.11.0 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '>=' 21.11.0 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=>=' 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 22 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@350 -- $ local d=22 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@352 -- $ echo 22 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=22 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:01:04.814 16:54:59 build_native_dpdk -- scripts/common.sh@364 -- $ return 0 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@112 -- $ intel_ipsec_mb_ver=v1.0 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@113 -- $ intel_ipsec_mb_drv=crypto/ipsec_mb 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@114 -- $ intel_ipsec_lib=lib 00:01:04.814 16:54:59 build_native_dpdk -- common/autobuild_common.sh@116 -- $ git clone --branch v1.0 --depth 1 https://github.com/intel/intel-ipsec-mb.git /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb 00:01:04.814 Cloning into '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb'... 00:01:06.193 Note: switching to 'a1a289dabb23be78d6531de481ba6a417c67b0a5'. 00:01:06.193 00:01:06.193 You are in 'detached HEAD' state. You can look around, make experimental 00:01:06.193 changes and commit them, and you can discard any commits you make in this 00:01:06.193 state without impacting any branches by switching back to a branch. 
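The `cmp_versions` trace above walks 22.11.4 against 21.11.0 component by component to decide that DPDK is new enough for intel-ipsec-mb v1.0. A compact re-creation of that idea (my own helper, not SPDK's `scripts/common.sh`, and splitting on dots only where the original also splits on `-` and `:`):

```shell
# Return success when dotted version $1 >= $2, comparing numeric
# components left to right, missing components treated as 0.
ver_ge() {
    a=$1 b=$2
    while [ -n "$a" ] || [ -n "$b" ]; do
        x=${a%%.*} y=${b%%.*}
        [ "${x:-0}" -gt "${y:-0}" ] && return 0
        [ "${x:-0}" -lt "${y:-0}" ] && return 1
        case $a in *.*) a=${a#*.} ;; *) a= ;; esac
        case $b in *.*) b=${b#*.} ;; *) b= ;; esac
    done
    return 0   # all components equal
}

ver_ge 22.11.4 21.11.0 && echo "DPDK >= 21.11: pick intel-ipsec-mb v1.0"
```

The first component already decides this comparison (22 > 21), which matches the trace above returning at `ver1[v] > ver2[v]` without examining the remaining fields.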
00:01:06.193 00:01:06.193 If you want to create a new branch to retain commits you create, you may 00:01:06.193 do so (now or later) by using -c with the switch command. Example: 00:01:06.193 00:01:06.193 git switch -c 00:01:06.193 00:01:06.193 Or undo this operation with: 00:01:06.193 00:01:06.193 git switch - 00:01:06.193 00:01:06.193 Turn off this advice by setting config variable advice.detachedHead to false 00:01:06.193 00:01:06.193 16:55:01 build_native_dpdk -- common/autobuild_common.sh@117 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb 00:01:06.193 16:55:01 build_native_dpdk -- common/autobuild_common.sh@118 -- $ make -j72 all SHARED=y EXTRA_CFLAGS=-fPIC 00:01:06.193 make -C lib 00:01:06.193 make[1]: Entering directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:01:06.763 mkdir obj 00:01:06.763 nasm -MD obj/aes_keyexp_128.d -MT obj/aes_keyexp_128.o -o obj/aes_keyexp_128.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_keyexp_128.asm 00:01:06.763 nasm -MD obj/aes_keyexp_192.d -MT obj/aes_keyexp_192.o -o obj/aes_keyexp_192.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_keyexp_192.asm 00:01:06.763 nasm -MD obj/aes_keyexp_256.d -MT obj/aes_keyexp_256.o -o obj/aes_keyexp_256.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_keyexp_256.asm 00:01:06.763 nasm -MD obj/aes_cmac_subkey_gen.d -MT obj/aes_cmac_subkey_gen.o -o obj/aes_cmac_subkey_gen.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes_cmac_subkey_gen.asm 00:01:06.763 nasm -MD obj/save_xmms.d -MT obj/save_xmms.o -o obj/save_xmms.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/save_xmms.asm 00:01:06.763 nasm -MD obj/clear_regs_mem_fns.d -MT obj/clear_regs_mem_fns.o -o 
obj/clear_regs_mem_fns.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/clear_regs_mem_fns.asm 00:01:06.763 nasm -MD obj/const.d -MT obj/const.o -o obj/const.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/const.asm 00:01:06.763 nasm -MD obj/aes128_ecbenc_x3.d -MT obj/aes128_ecbenc_x3.o -o obj/aes128_ecbenc_x3.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/aes128_ecbenc_x3.asm 00:01:06.763 nasm -MD obj/zuc_common.d -MT obj/zuc_common.o -o obj/zuc_common.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/zuc_common.asm 00:01:06.763 nasm -MD obj/wireless_common.d -MT obj/wireless_common.o -o obj/wireless_common.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/wireless_common.asm 00:01:06.763 nasm -MD obj/constant_lookup.d -MT obj/constant_lookup.o -o obj/constant_lookup.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/constant_lookup.asm 00:01:06.763 nasm -MD obj/crc32_refl_const.d -MT obj/crc32_refl_const.o -o obj/crc32_refl_const.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/crc32_refl_const.asm 00:01:06.763 ld -r -z ibt -z shstk -o obj/save_xmms.o.tmp obj/save_xmms.o 00:01:06.763 nasm -MD obj/crc32_const.d -MT obj/crc32_const.o -o obj/crc32_const.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/crc32_const.asm 00:01:06.763 ld -r -z ibt -z shstk -o obj/const.o.tmp obj/const.o 00:01:06.763 ld -r -z ibt -z shstk -o obj/clear_regs_mem_fns.o.tmp obj/clear_regs_mem_fns.o 00:01:06.763 mv obj/save_xmms.o.tmp obj/save_xmms.o 00:01:06.763 nasm -MD obj/poly1305.d -MT obj/poly1305.o -o obj/poly1305.o -Werror -felf64 -Xgnu -gdwarf -DLINUX 
-D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/poly1305.asm 00:01:06.763 ld -r -z ibt -z shstk -o obj/wireless_common.o.tmp obj/wireless_common.o 00:01:06.763 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/chacha20_poly1305.c -o obj/chacha20_poly1305.o 00:01:06.763 ld -r -z ibt -z shstk -o obj/crc32_refl_const.o.tmp obj/crc32_refl_const.o 00:01:06.763 mv obj/const.o.tmp obj/const.o 00:01:06.763 mv obj/clear_regs_mem_fns.o.tmp obj/clear_regs_mem_fns.o 00:01:06.763 mv obj/wireless_common.o.tmp obj/wireless_common.o 00:01:06.763 ld -r -z ibt -z shstk -o obj/crc32_const.o.tmp obj/crc32_const.o 00:01:06.763 nasm -MD obj/aes128_cbc_dec_by4_sse_no_aesni.d -MT obj/aes128_cbc_dec_by4_sse_no_aesni.o -o obj/aes128_cbc_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cbc_dec_by4_sse_no_aesni.asm 00:01:06.763 mv obj/crc32_refl_const.o.tmp obj/crc32_refl_const.o 00:01:06.763 nasm -MD obj/aes192_cbc_dec_by4_sse_no_aesni.d -MT obj/aes192_cbc_dec_by4_sse_no_aesni.o -o obj/aes192_cbc_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes192_cbc_dec_by4_sse_no_aesni.asm 00:01:06.763 mv obj/crc32_const.o.tmp obj/crc32_const.o 00:01:06.763 nasm -MD obj/aes256_cbc_dec_by4_sse_no_aesni.d -MT obj/aes256_cbc_dec_by4_sse_no_aesni.o -o obj/aes256_cbc_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM 
-DSAFE_LOOKUP no-aesni/aes256_cbc_dec_by4_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes_cbc_enc_128_x4_no_aesni.d -MT obj/aes_cbc_enc_128_x4_no_aesni.o -o obj/aes_cbc_enc_128_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbc_enc_128_x4_no_aesni.asm 00:01:06.763 ld -r -z ibt -z shstk -o obj/constant_lookup.o.tmp obj/constant_lookup.o 00:01:06.763 nasm -MD obj/aes_cbc_enc_192_x4_no_aesni.d -MT obj/aes_cbc_enc_192_x4_no_aesni.o -o obj/aes_cbc_enc_192_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbc_enc_192_x4_no_aesni.asm 00:01:06.763 nasm -MD obj/aes_cbc_enc_256_x4_no_aesni.d -MT obj/aes_cbc_enc_256_x4_no_aesni.o -o obj/aes_cbc_enc_256_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbc_enc_256_x4_no_aesni.asm 00:01:06.763 nasm -MD obj/aes128_cntr_by8_sse_no_aesni.d -MT obj/aes128_cntr_by8_sse_no_aesni.o -o obj/aes128_cntr_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cntr_by8_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes192_cntr_by8_sse_no_aesni.d -MT obj/aes192_cntr_by8_sse_no_aesni.o -o obj/aes192_cntr_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes192_cntr_by8_sse_no_aesni.asm 00:01:06.763 mv obj/constant_lookup.o.tmp obj/constant_lookup.o 00:01:06.763 nasm -MD obj/aes256_cntr_by8_sse_no_aesni.d -MT obj/aes256_cntr_by8_sse_no_aesni.o -o obj/aes256_cntr_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cntr_by8_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes_ecb_by4_sse_no_aesni.d -MT obj/aes_ecb_by4_sse_no_aesni.o -o obj/aes_ecb_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ 
-DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_ecb_by4_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes128_cntr_ccm_by8_sse_no_aesni.d -MT obj/aes128_cntr_ccm_by8_sse_no_aesni.o -o obj/aes128_cntr_ccm_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cntr_ccm_by8_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes256_cntr_ccm_by8_sse_no_aesni.d -MT obj/aes256_cntr_ccm_by8_sse_no_aesni.o -o obj/aes256_cntr_ccm_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cntr_ccm_by8_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/pon_sse_no_aesni.d -MT obj/pon_sse_no_aesni.o -o obj/pon_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/pon_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/zuc_sse_no_aesni.d -MT obj/zuc_sse_no_aesni.o -o obj/zuc_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/zuc_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes_cfb_sse_no_aesni.d -MT obj/aes_cfb_sse_no_aesni.o -o obj/aes_cfb_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cfb_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/aes128_cbc_mac_x4_no_aesni.d -MT obj/aes128_cbc_mac_x4_no_aesni.o -o obj/aes128_cbc_mac_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cbc_mac_x4_no_aesni.asm 00:01:06.763 nasm -MD obj/aes256_cbc_mac_x4_no_aesni.d -MT obj/aes256_cbc_mac_x4_no_aesni.o -o obj/aes256_cbc_mac_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes256_cbc_mac_x4_no_aesni.asm 00:01:06.763 nasm -MD obj/aes_xcbc_mac_128_x4_no_aesni.d -MT obj/aes_xcbc_mac_128_x4_no_aesni.o -o obj/aes_xcbc_mac_128_x4_no_aesni.o -Werror 
-felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_xcbc_mac_128_x4_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_flush_sse_no_aesni.o -o obj/mb_mgr_aes_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes_submit_sse_no_aesni.d -MT obj/mb_mgr_aes_submit_sse_no_aesni.o -o obj/mb_mgr_aes_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_submit_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes192_flush_sse_no_aesni.d -MT obj/mb_mgr_aes192_flush_sse_no_aesni.o -o obj/mb_mgr_aes192_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes192_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes192_submit_sse_no_aesni.d -MT obj/mb_mgr_aes192_submit_sse_no_aesni.o -o obj/mb_mgr_aes192_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes192_submit_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes256_flush_sse_no_aesni.d -MT obj/mb_mgr_aes256_flush_sse_no_aesni.o -o obj/mb_mgr_aes256_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes256_submit_sse_no_aesni.d -MT obj/mb_mgr_aes256_submit_sse_no_aesni.o -o obj/mb_mgr_aes256_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_submit_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o -o 
obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.asm 00:01:06.763 nasm -MD obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.d -MT obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o -o obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_xcbc_flush_sse_no_aesni.asm 00:01:06.764 nasm -MD obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.d -MT obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o -o obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes_xcbc_submit_sse_no_aesni.asm 00:01:06.764 nasm -MD obj/mb_mgr_zuc_submit_flush_sse_no_aesni.d -MT obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o -o obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf 
-DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_zuc_submit_flush_sse_no_aesni.asm 00:01:06.764 nasm -MD obj/ethernet_fcs_sse_no_aesni.d -MT obj/ethernet_fcs_sse_no_aesni.o -o obj/ethernet_fcs_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/ethernet_fcs_sse_no_aesni.asm 00:01:06.764 ld -r -z ibt -z shstk -o obj/poly1305.o.tmp obj/poly1305.o 00:01:06.764 nasm -MD obj/crc16_x25_sse_no_aesni.d -MT obj/crc16_x25_sse_no_aesni.o -o obj/crc16_x25_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc16_x25_sse_no_aesni.asm 00:01:07.025 nasm -MD obj/aes_cbcs_1_9_enc_128_x4_no_aesni.d -MT obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o -o obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes_cbcs_1_9_enc_128_x4_no_aesni.asm 00:01:07.025 nasm -MD obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.d -MT obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o -o obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/aes128_cbcs_1_9_dec_by4_sse_no_aesni.asm 00:01:07.025 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_submit_sse.d -MT obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes128_cbcs_1_9_submit_sse.asm 00:01:07.025 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_flush_sse.d -MT obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes128_cbcs_1_9_flush_sse.asm 00:01:07.025 mv obj/poly1305.o.tmp obj/poly1305.o 00:01:07.025 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.d -MT 
obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/ethernet_fcs_sse_no_aesni.o.tmp obj/ethernet_fcs_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_flush_sse_no_aesni.o 00:01:07.025 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.d -MT obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.asm 00:01:07.025 mv obj/ethernet_fcs_sse_no_aesni.o.tmp obj/ethernet_fcs_sse_no_aesni.o 00:01:07.025 nasm -MD obj/crc32_refl_by8_sse_no_aesni.d -MT obj/crc32_refl_by8_sse_no_aesni.o -o obj/crc32_refl_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_refl_by8_sse_no_aesni.asm 00:01:07.025 mv obj/mb_mgr_aes_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_flush_sse_no_aesni.o 00:01:07.025 nasm -MD obj/crc32_by8_sse_no_aesni.d -MT obj/crc32_by8_sse_no_aesni.o -o obj/crc32_by8_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_by8_sse_no_aesni.asm 00:01:07.025 nasm -MD obj/crc32_sctp_sse_no_aesni.d -MT obj/crc32_sctp_sse_no_aesni.o -o obj/crc32_sctp_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_sctp_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes192_submit_sse_no_aesni.o 00:01:07.025 nasm -MD obj/crc32_lte_sse_no_aesni.d -MT obj/crc32_lte_sse_no_aesni.o -o obj/crc32_lte_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf 
-DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_lte_sse_no_aesni.asm 00:01:07.025 nasm -MD obj/crc32_fp_sse_no_aesni.d -MT obj/crc32_fp_sse_no_aesni.o -o obj/crc32_fp_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_fp_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes256_submit_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes192_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes192_submit_sse_no_aesni.o 00:01:07.025 nasm -MD obj/crc32_iuup_sse_no_aesni.d -MT obj/crc32_iuup_sse_no_aesni.o -o obj/crc32_iuup_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_iuup_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes192_flush_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_flush_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/crc16_x25_sse_no_aesni.o.tmp obj/crc16_x25_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/crc32_sctp_sse_no_aesni.o.tmp obj/crc32_sctp_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes256_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes256_submit_sse_no_aesni.o 00:01:07.025 nasm -MD obj/crc32_wimax_sse_no_aesni.d -MT obj/crc32_wimax_sse_no_aesni.o -o obj/crc32_wimax_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/crc32_wimax_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes192_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes192_flush_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/crc32_iuup_sse_no_aesni.o.tmp obj/crc32_iuup_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o 
obj/aes_cmac_subkey_gen.o.tmp obj/aes_cmac_subkey_gen.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/crc32_lte_sse_no_aesni.o.tmp obj/crc32_lte_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes256_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_flush_sse_no_aesni.o 00:01:07.025 mv obj/crc16_x25_sse_no_aesni.o.tmp obj/crc16_x25_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/crc32_fp_sse_no_aesni.o.tmp obj/crc32_fp_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o 00:01:07.025 mv obj/crc32_iuup_sse_no_aesni.o.tmp obj/crc32_iuup_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_submit_sse_no_aesni.o 00:01:07.025 mv obj/crc32_sctp_sse_no_aesni.o.tmp obj/crc32_sctp_sse_no_aesni.o 00:01:07.025 mv obj/aes_cmac_subkey_gen.o.tmp obj/aes_cmac_subkey_gen.o 00:01:07.025 mv obj/crc32_lte_sse_no_aesni.o.tmp obj/crc32_lte_sse_no_aesni.o 00:01:07.025 mv obj/crc32_fp_sse_no_aesni.o.tmp obj/crc32_fp_sse_no_aesni.o 00:01:07.025 nasm -MD obj/gcm128_sse_no_aesni.d -MT obj/gcm128_sse_no_aesni.o -o obj/gcm128_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/gcm128_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes_keyexp_192.o.tmp obj/aes_keyexp_192.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/crc32_wimax_sse_no_aesni.o.tmp obj/crc32_wimax_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_submit_sse_no_aesni.o 00:01:07.025 nasm -MD obj/gcm192_sse_no_aesni.d -MT obj/gcm192_sse_no_aesni.o -o obj/gcm192_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/gcm192_sse_no_aesni.asm 00:01:07.025 mv obj/aes_keyexp_192.o.tmp obj/aes_keyexp_192.o 00:01:07.025 mv obj/crc32_wimax_sse_no_aesni.o.tmp obj/crc32_wimax_sse_no_aesni.o 00:01:07.025 nasm -MD obj/gcm256_sse_no_aesni.d -MT 
obj/gcm256_sse_no_aesni.o -o obj/gcm256_sse_no_aesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/gcm256_sse_no_aesni.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes128_ecbenc_x3.o.tmp obj/aes128_ecbenc_x3.o 00:01:07.025 nasm -MD obj/aes128_cbc_dec_by4_sse.d -MT obj/aes128_cbc_dec_by4_sse.o -o obj/aes128_cbc_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_dec_by4_sse.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o 00:01:07.025 mv obj/aes128_ecbenc_x3.o.tmp obj/aes128_ecbenc_x3.o 00:01:07.025 nasm -MD obj/aes128_cbc_dec_by8_sse.d -MT obj/aes128_cbc_dec_by8_sse.o -o obj/aes128_cbc_dec_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_dec_by8_sse.asm 00:01:07.025 mv obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o 00:01:07.025 nasm -MD obj/aes192_cbc_dec_by4_sse.d -MT obj/aes192_cbc_dec_by4_sse.o -o obj/aes192_cbc_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes192_cbc_dec_by4_sse.asm 00:01:07.025 nasm -MD obj/aes192_cbc_dec_by8_sse.d -MT obj/aes192_cbc_dec_by8_sse.o -o obj/aes192_cbc_dec_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes192_cbc_dec_by8_sse.asm 00:01:07.025 nasm -MD obj/aes256_cbc_dec_by4_sse.d -MT obj/aes256_cbc_dec_by4_sse.o -o obj/aes256_cbc_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_dec_by4_sse.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o.tmp 
obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by4_sse.o.tmp obj/aes128_cbc_dec_by4_sse.o 00:01:07.025 nasm -MD obj/aes256_cbc_dec_by8_sse.d -MT obj/aes256_cbc_dec_by8_sse.o -o obj/aes256_cbc_dec_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_dec_by8_sse.asm 00:01:07.025 nasm -MD obj/aes_cbc_enc_128_x4.d -MT obj/aes_cbc_enc_128_x4.o -o obj/aes_cbc_enc_128_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_128_x4.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by4_sse.o.tmp obj/aes192_cbc_dec_by4_sse.o 00:01:07.025 mv obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o 00:01:07.025 mv obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o 00:01:07.025 mv obj/aes128_cbc_dec_by4_sse.o.tmp obj/aes128_cbc_dec_by4_sse.o 00:01:07.025 nasm -MD obj/aes_cbc_enc_192_x4.d -MT obj/aes_cbc_enc_192_x4.o -o obj/aes_cbc_enc_192_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_192_x4.asm 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes_keyexp_128.o.tmp obj/aes_keyexp_128.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes_keyexp_256.o.tmp obj/aes_keyexp_256.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by8_sse.o.tmp obj/aes128_cbc_dec_by8_sse.o 00:01:07.025 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by8_sse.o.tmp obj/aes192_cbc_dec_by8_sse.o 
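The interleaved `nasm` / `ld -r` / `mv` lines throughout this build all follow one per-object recipe: assemble the module, relink the object relocatably (presumably so it carries the IBT/SHSTK GNU property notes for CET that the assembler output lacks), then swap the tagged object into place. A sketch of that recipe for a single hypothetical module — flags copied from the log, `example.asm` is a placeholder, and the CET rationale is an inference, not stated in the log:

```
# Assemble one module (flags as they appear in the log above).
nasm -MD obj/example.d -MT obj/example.o -o obj/example.o \
     -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ \
     -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP x86_64/example.asm

# Relocatable relink tags the object with -z ibt / -z shstk properties;
# the mv then replaces the original object with the tagged copy.
ld -r -z ibt -z shstk -o obj/example.o.tmp obj/example.o
mv obj/example.o.tmp obj/example.o
```

Because `make -j72` runs many such recipes concurrently, the `nasm`, `ld` and `mv` steps of different objects appear interleaved in the log rather than grouped per file.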
00:01:07.025 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by4_sse.o.tmp obj/aes256_cbc_dec_by4_sse.o 00:01:07.025 mv obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o 00:01:07.025 mv obj/aes192_cbc_dec_by4_sse.o.tmp obj/aes192_cbc_dec_by4_sse.o 00:01:07.026 mv obj/aes_keyexp_128.o.tmp obj/aes_keyexp_128.o 00:01:07.026 mv obj/aes_keyexp_256.o.tmp obj/aes_keyexp_256.o 00:01:07.026 mv obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o 00:01:07.026 mv obj/aes256_cbc_dec_by4_sse.o.tmp obj/aes256_cbc_dec_by4_sse.o 00:01:07.026 nasm -MD obj/aes_cbc_enc_256_x4.d -MT obj/aes_cbc_enc_256_x4.o -o obj/aes_cbc_enc_256_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_256_x4.asm 00:01:07.026 mv obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o 00:01:07.026 mv obj/aes128_cbc_dec_by8_sse.o.tmp obj/aes128_cbc_dec_by8_sse.o 00:01:07.026 mv obj/aes192_cbc_dec_by8_sse.o.tmp obj/aes192_cbc_dec_by8_sse.o 00:01:07.026 nasm -MD obj/aes_cbc_enc_128_x8_sse.d -MT obj/aes_cbc_enc_128_x8_sse.o -o obj/aes_cbc_enc_128_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_128_x8_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by8_sse.o.tmp obj/aes256_cbc_dec_by8_sse.o 00:01:07.026 nasm -MD obj/aes_cbc_enc_192_x8_sse.d -MT obj/aes_cbc_enc_192_x8_sse.o -o obj/aes_cbc_enc_192_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_192_x8_sse.asm 00:01:07.026 nasm -MD obj/aes_cbc_enc_256_x8_sse.d -MT obj/aes_cbc_enc_256_x8_sse.o -o obj/aes_cbc_enc_256_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbc_enc_256_x8_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x4.o.tmp 
obj/aes_cbc_enc_128_x4.o 00:01:07.026 nasm -MD obj/pon_sse.d -MT obj/pon_sse.o -o obj/pon_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/pon_sse.asm 00:01:07.026 nasm -MD obj/aes128_cntr_by8_sse.d -MT obj/aes128_cntr_by8_sse.o -o obj/aes128_cntr_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cntr_by8_sse.asm 00:01:07.026 mv obj/aes256_cbc_dec_by8_sse.o.tmp obj/aes256_cbc_dec_by8_sse.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x4.o.tmp obj/aes_cbc_enc_192_x4.o 00:01:07.026 nasm -MD obj/aes192_cntr_by8_sse.d -MT obj/aes192_cntr_by8_sse.o -o obj/aes192_cntr_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes192_cntr_by8_sse.asm 00:01:07.026 nasm -MD obj/aes256_cntr_by8_sse.d -MT obj/aes256_cntr_by8_sse.o -o obj/aes256_cntr_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cntr_by8_sse.asm 00:01:07.026 nasm -MD obj/aes_ecb_by4_sse.d -MT obj/aes_ecb_by4_sse.o -o obj/aes_ecb_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_ecb_by4_sse.asm 00:01:07.026 mv obj/aes_cbc_enc_128_x4.o.tmp obj/aes_cbc_enc_128_x4.o 00:01:07.026 nasm -MD obj/aes128_cntr_ccm_by8_sse.d -MT obj/aes128_cntr_ccm_by8_sse.o -o obj/aes128_cntr_ccm_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cntr_ccm_by8_sse.asm 00:01:07.026 nasm -MD obj/aes256_cntr_ccm_by8_sse.d -MT obj/aes256_cntr_ccm_by8_sse.o -o obj/aes256_cntr_ccm_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cntr_ccm_by8_sse.asm 00:01:07.026 nasm -MD obj/aes_cfb_sse.d -MT obj/aes_cfb_sse.o -o obj/aes_cfb_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM 
-DSAFE_LOOKUP sse/aes_cfb_sse.asm 00:01:07.026 mv obj/aes_cbc_enc_192_x4.o.tmp obj/aes_cbc_enc_192_x4.o 00:01:07.026 nasm -MD obj/aes128_cbc_mac_x4.d -MT obj/aes128_cbc_mac_x4.o -o obj/aes128_cbc_mac_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_mac_x4.asm 00:01:07.026 nasm -MD obj/aes256_cbc_mac_x4.d -MT obj/aes256_cbc_mac_x4.o -o obj/aes256_cbc_mac_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_mac_x4.asm 00:01:07.026 nasm -MD obj/aes128_cbc_mac_x8_sse.d -MT obj/aes128_cbc_mac_x8_sse.o -o obj/aes128_cbc_mac_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbc_mac_x8_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x4.o.tmp obj/aes_cbc_enc_256_x4.o 00:01:07.026 nasm -MD obj/aes256_cbc_mac_x8_sse.d -MT obj/aes256_cbc_mac_x8_sse.o -o obj/aes256_cbc_mac_x8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes256_cbc_mac_x8_sse.asm 00:01:07.026 nasm -MD obj/aes_xcbc_mac_128_x4.d -MT obj/aes_xcbc_mac_128_x4.o -o obj/aes_xcbc_mac_128_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_xcbc_mac_128_x4.asm 00:01:07.026 nasm -MD obj/md5_x4x2_sse.d -MT obj/md5_x4x2_sse.o -o obj/md5_x4x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/md5_x4x2_sse.asm 00:01:07.026 nasm -MD obj/sha1_mult_sse.d -MT obj/sha1_mult_sse.o -o obj/sha1_mult_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha1_mult_sse.asm 00:01:07.026 mv obj/aes_cbc_enc_256_x4.o.tmp obj/aes_cbc_enc_256_x4.o 00:01:07.026 nasm -MD obj/sha1_one_block_sse.d -MT obj/sha1_one_block_sse.o -o obj/sha1_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA 
-DSAFE_PARAM -DSAFE_LOOKUP sse/sha1_one_block_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x8_sse.o.tmp obj/aes_cbc_enc_256_x8_sse.o 00:01:07.026 nasm -MD obj/sha224_one_block_sse.d -MT obj/sha224_one_block_sse.o -o obj/sha224_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha224_one_block_sse.asm 00:01:07.026 nasm -MD obj/sha256_one_block_sse.d -MT obj/sha256_one_block_sse.o -o obj/sha256_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha256_one_block_sse.asm 00:01:07.026 nasm -MD obj/sha384_one_block_sse.d -MT obj/sha384_one_block_sse.o -o obj/sha384_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha384_one_block_sse.asm 00:01:07.026 mv obj/aes_cbc_enc_256_x8_sse.o.tmp obj/aes_cbc_enc_256_x8_sse.o 00:01:07.026 nasm -MD obj/sha512_one_block_sse.d -MT obj/sha512_one_block_sse.o -o obj/sha512_one_block_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha512_one_block_sse.asm 00:01:07.026 nasm -MD obj/sha512_x2_sse.d -MT obj/sha512_x2_sse.o -o obj/sha512_x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha512_x2_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x8_sse.o.tmp obj/aes_cbc_enc_128_x8_sse.o 00:01:07.026 nasm -MD obj/sha_256_mult_sse.d -MT obj/sha_256_mult_sse.o -o obj/sha_256_mult_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha_256_mult_sse.asm 00:01:07.026 nasm -MD obj/sha1_ni_x2_sse.d -MT obj/sha1_ni_x2_sse.o -o obj/sha1_ni_x2_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha1_ni_x2_sse.asm 00:01:07.026 nasm -MD obj/sha256_ni_x2_sse.d -MT obj/sha256_ni_x2_sse.o -o obj/sha256_ni_x2_sse.o 
-Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/sha256_ni_x2_sse.asm 00:01:07.026 nasm -MD obj/zuc_sse.d -MT obj/zuc_sse.o -o obj/zuc_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/zuc_sse.asm 00:01:07.026 mv obj/aes_cbc_enc_128_x8_sse.o.tmp obj/aes_cbc_enc_128_x8_sse.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x8_sse.o.tmp obj/aes_cbc_enc_192_x8_sse.o 00:01:07.026 nasm -MD obj/zuc_sse_gfni.d -MT obj/zuc_sse_gfni.o -o obj/zuc_sse_gfni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/zuc_sse_gfni.asm 00:01:07.026 nasm -MD obj/mb_mgr_aes_flush_sse.d -MT obj/mb_mgr_aes_flush_sse.o -o obj/mb_mgr_aes_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_flush_sse.asm 00:01:07.026 nasm -MD obj/mb_mgr_aes_submit_sse.d -MT obj/mb_mgr_aes_submit_sse.o -o obj/mb_mgr_aes_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_submit_sse.asm 00:01:07.026 mv obj/aes_cbc_enc_192_x8_sse.o.tmp obj/aes_cbc_enc_192_x8_sse.o 00:01:07.026 nasm -MD obj/mb_mgr_aes192_flush_sse.d -MT obj/mb_mgr_aes192_flush_sse.o -o obj/mb_mgr_aes192_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_flush_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cfb_sse.o.tmp obj/aes_cfb_sse.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x8_sse.o.tmp obj/aes128_cbc_mac_x8_sse.o 00:01:07.026 nasm -MD obj/mb_mgr_aes192_submit_sse.d -MT obj/mb_mgr_aes192_submit_sse.o -o obj/mb_mgr_aes192_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_submit_sse.asm 00:01:07.026 nasm -MD obj/mb_mgr_aes256_flush_sse.d -MT obj/mb_mgr_aes256_flush_sse.o -o 
obj/mb_mgr_aes256_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_flush_sse.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o 00:01:07.026 mv obj/aes_cfb_sse.o.tmp obj/aes_cfb_sse.o 00:01:07.026 mv obj/aes128_cbc_mac_x8_sse.o.tmp obj/aes128_cbc_mac_x8_sse.o 00:01:07.026 nasm -MD obj/mb_mgr_aes256_submit_sse.d -MT obj/mb_mgr_aes256_submit_sse.o -o obj/mb_mgr_aes256_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_submit_sse.asm 00:01:07.026 nasm -MD obj/mb_mgr_aes_flush_sse_x8.d -MT obj/mb_mgr_aes_flush_sse_x8.o -o obj/mb_mgr_aes_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_flush_sse_x8.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/sha384_one_block_sse.o.tmp obj/sha384_one_block_sse.o 00:01:07.026 mv obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/sha224_one_block_sse.o.tmp obj/sha224_one_block_sse.o 00:01:07.026 mv obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_cfb_sse_no_aesni.o.tmp obj/aes_cfb_sse_no_aesni.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_ecb_by4_sse.o.tmp obj/aes_ecb_by4_sse.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/sha1_one_block_sse.o.tmp obj/sha1_one_block_sse.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/sha256_one_block_sse.o.tmp obj/sha256_one_block_sse.o 00:01:07.026 mv obj/sha384_one_block_sse.o.tmp obj/sha384_one_block_sse.o 00:01:07.026 nasm -MD obj/mb_mgr_aes_submit_sse_x8.d -MT 
obj/mb_mgr_aes_submit_sse_x8.o -o obj/mb_mgr_aes_submit_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_submit_sse_x8.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x8_sse.o.tmp obj/aes256_cbc_mac_x8_sse.o 00:01:07.026 mv obj/sha224_one_block_sse.o.tmp obj/sha224_one_block_sse.o 00:01:07.026 mv obj/aes_cfb_sse_no_aesni.o.tmp obj/aes_cfb_sse_no_aesni.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/sha512_one_block_sse.o.tmp obj/sha512_one_block_sse.o 00:01:07.026 mv obj/aes_ecb_by4_sse.o.tmp obj/aes_ecb_by4_sse.o 00:01:07.026 mv obj/sha1_one_block_sse.o.tmp obj/sha1_one_block_sse.o 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x4.o.tmp obj/aes256_cbc_mac_x4.o 00:01:07.026 mv obj/aes256_cbc_mac_x8_sse.o.tmp obj/aes256_cbc_mac_x8_sse.o 00:01:07.026 nasm -MD obj/mb_mgr_aes192_flush_sse_x8.d -MT obj/mb_mgr_aes192_flush_sse_x8.o -o obj/mb_mgr_aes192_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_flush_sse_x8.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x4.o.tmp obj/aes128_cbc_mac_x4.o 00:01:07.026 mv obj/sha512_one_block_sse.o.tmp obj/sha512_one_block_sse.o 00:01:07.026 nasm -MD obj/mb_mgr_aes192_submit_sse_x8.d -MT obj/mb_mgr_aes192_submit_sse_x8.o -o obj/mb_mgr_aes192_submit_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes192_submit_sse_x8.asm 00:01:07.026 ld -r -z ibt -z shstk -o obj/aes_xcbc_mac_128_x4.o.tmp obj/aes_xcbc_mac_128_x4.o 00:01:07.026 mv obj/sha256_one_block_sse.o.tmp obj/sha256_one_block_sse.o 00:01:07.026 mv obj/aes256_cbc_mac_x4.o.tmp obj/aes256_cbc_mac_x4.o 00:01:07.026 mv obj/aes128_cbc_mac_x4.o.tmp obj/aes128_cbc_mac_x4.o 00:01:07.027 nasm -MD obj/mb_mgr_aes256_flush_sse_x8.d -MT obj/mb_mgr_aes256_flush_sse_x8.o -o obj/mb_mgr_aes256_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ 
-I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_flush_sse_x8.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o 00:01:07.027 mv obj/aes_xcbc_mac_128_x4.o.tmp obj/aes_xcbc_mac_128_x4.o 00:01:07.027 nasm -MD obj/mb_mgr_aes256_submit_sse_x8.d -MT obj/mb_mgr_aes256_submit_sse_x8.o -o obj/mb_mgr_aes256_submit_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_submit_sse_x8.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_sse.d -MT obj/mb_mgr_aes_cmac_submit_flush_sse.o -o obj/mb_mgr_aes_cmac_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_cmac_submit_flush_sse.asm 00:01:07.027 mv obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o 00:01:07.027 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_sse.d -MT obj/mb_mgr_aes256_cmac_submit_flush_sse.o -o obj/mb_mgr_aes256_cmac_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_cmac_submit_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_sse_x8.d -MT obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o -o obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_cmac_submit_flush_sse_x8.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.d -MT obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o -o obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_cmac_submit_flush_sse_x8.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_sse.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o -o 
obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_ccm_auth_submit_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_sse.o.tmp obj/mb_mgr_aes192_flush_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_ccm_auth_submit_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes_xcbc_flush_sse.d -MT obj/mb_mgr_aes_xcbc_flush_sse.o -o obj/mb_mgr_aes_xcbc_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_xcbc_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_aes_xcbc_submit_sse.d -MT obj/mb_mgr_aes_xcbc_submit_sse.o -o obj/mb_mgr_aes_xcbc_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_aes_xcbc_submit_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/sha256_ni_x2_sse.o.tmp obj/sha256_ni_x2_sse.o 00:01:07.027 mv obj/mb_mgr_aes192_flush_sse.o.tmp obj/mb_mgr_aes192_flush_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_md5_flush_sse.d -MT obj/mb_mgr_hmac_md5_flush_sse.o -o 
obj/mb_mgr_hmac_md5_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_md5_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_md5_submit_sse.d -MT obj/mb_mgr_hmac_md5_submit_sse.o -o obj/mb_mgr_hmac_md5_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_md5_submit_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_flush_sse.d -MT obj/mb_mgr_hmac_flush_sse.o -o obj/mb_mgr_hmac_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_flush_sse.asm 00:01:07.027 mv obj/sha256_ni_x2_sse.o.tmp obj/sha256_ni_x2_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_submit_sse.d -MT obj/mb_mgr_hmac_submit_sse.o -o obj/mb_mgr_hmac_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_submit_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_224_flush_sse.d -MT obj/mb_mgr_hmac_sha_224_flush_sse.o -o obj/mb_mgr_hmac_sha_224_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_flush_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/aes128_cntr_ccm_by8_sse.o.tmp obj/aes128_cntr_ccm_by8_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/sha1_ni_x2_sse.o.tmp obj/sha1_ni_x2_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_sse_x8.o.tmp obj/mb_mgr_aes_submit_sse_x8.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_sse.o.tmp obj/mb_mgr_aes_flush_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_sse.o.tmp obj/mb_mgr_aes256_submit_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_224_submit_sse.d -MT obj/mb_mgr_hmac_sha_224_submit_sse.o -o obj/mb_mgr_hmac_sha_224_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_submit_sse.asm 
00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_sse.o.tmp obj/mb_mgr_aes_submit_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_256_flush_sse.d -MT obj/mb_mgr_hmac_sha_256_flush_sse.o -o obj/mb_mgr_hmac_sha_256_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_flush_sse.asm 00:01:07.027 mv obj/aes128_cntr_ccm_by8_sse.o.tmp obj/aes128_cntr_ccm_by8_sse.o 00:01:07.027 mv obj/sha1_ni_x2_sse.o.tmp obj/sha1_ni_x2_sse.o 00:01:07.027 mv obj/mb_mgr_aes_submit_sse_x8.o.tmp obj/mb_mgr_aes_submit_sse_x8.o 00:01:07.027 mv obj/mb_mgr_aes_flush_sse.o.tmp obj/mb_mgr_aes_flush_sse.o 00:01:07.027 mv obj/mb_mgr_aes256_submit_sse.o.tmp obj/mb_mgr_aes256_submit_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_256_submit_sse.d -MT obj/mb_mgr_hmac_sha_256_submit_sse.o -o obj/mb_mgr_hmac_sha_256_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_submit_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_sse.o.tmp obj/mb_mgr_aes192_submit_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_sse.o.tmp obj/mb_mgr_aes256_flush_sse.o 00:01:07.027 mv obj/mb_mgr_aes_submit_sse.o.tmp obj/mb_mgr_aes_submit_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_384_flush_sse.d -MT obj/mb_mgr_hmac_sha_384_flush_sse.o -o obj/mb_mgr_hmac_sha_384_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_384_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_384_submit_sse.d -MT obj/mb_mgr_hmac_sha_384_submit_sse.o -o obj/mb_mgr_hmac_sha_384_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_384_submit_sse.asm 00:01:07.027 mv obj/mb_mgr_aes192_submit_sse.o.tmp obj/mb_mgr_aes192_submit_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_sse_x8.o.tmp 
obj/mb_mgr_aes_flush_sse_x8.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_512_flush_sse.d -MT obj/mb_mgr_hmac_sha_512_flush_sse.o -o obj/mb_mgr_hmac_sha_512_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_512_flush_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_sse_x8.o.tmp obj/mb_mgr_aes256_flush_sse_x8.o 00:01:07.027 mv obj/mb_mgr_aes256_flush_sse.o.tmp obj/mb_mgr_aes256_flush_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_512_submit_sse.d -MT obj/mb_mgr_hmac_sha_512_submit_sse.o -o obj/mb_mgr_hmac_sha_512_submit_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_512_submit_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_flush_ni_sse.d -MT obj/mb_mgr_hmac_flush_ni_sse.o -o obj/mb_mgr_hmac_flush_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_flush_ni_sse.asm 00:01:07.027 mv obj/mb_mgr_aes_flush_sse_x8.o.tmp obj/mb_mgr_aes_flush_sse_x8.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_submit_ni_sse.d -MT obj/mb_mgr_hmac_submit_ni_sse.o -o obj/mb_mgr_hmac_submit_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_submit_ni_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_224_flush_ni_sse.d -MT obj/mb_mgr_hmac_sha_224_flush_ni_sse.o -o obj/mb_mgr_hmac_sha_224_flush_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_flush_ni_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/sha_256_mult_sse.o.tmp obj/sha_256_mult_sse.o 00:01:07.027 mv obj/mb_mgr_aes256_flush_sse_x8.o.tmp obj/mb_mgr_aes256_flush_sse_x8.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_224_submit_ni_sse.d -MT obj/mb_mgr_hmac_sha_224_submit_ni_sse.o -o obj/mb_mgr_hmac_sha_224_submit_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ 
-I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_224_submit_ni_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_256_flush_ni_sse.d -MT obj/mb_mgr_hmac_sha_256_flush_ni_sse.o -o obj/mb_mgr_hmac_sha_256_flush_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_flush_ni_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_sse_x8.o.tmp obj/mb_mgr_aes192_flush_sse_x8.o 00:01:07.027 nasm -MD obj/mb_mgr_hmac_sha_256_submit_ni_sse.d -MT obj/mb_mgr_hmac_sha_256_submit_ni_sse.o -o obj/mb_mgr_hmac_sha_256_submit_ni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_hmac_sha_256_submit_ni_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_sse_x8.o.tmp obj/mb_mgr_aes192_submit_sse_x8.o 00:01:07.027 mv obj/sha_256_mult_sse.o.tmp obj/sha_256_mult_sse.o 00:01:07.027 nasm -MD obj/mb_mgr_zuc_submit_flush_sse.d -MT obj/mb_mgr_zuc_submit_flush_sse.o -o obj/mb_mgr_zuc_submit_flush_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_zuc_submit_flush_sse.asm 00:01:07.027 nasm -MD obj/mb_mgr_zuc_submit_flush_gfni_sse.d -MT obj/mb_mgr_zuc_submit_flush_gfni_sse.o -o obj/mb_mgr_zuc_submit_flush_gfni_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/mb_mgr_zuc_submit_flush_gfni_sse.asm 00:01:07.027 mv obj/mb_mgr_aes192_flush_sse_x8.o.tmp obj/mb_mgr_aes192_flush_sse_x8.o 00:01:07.027 nasm -MD obj/ethernet_fcs_sse.d -MT obj/ethernet_fcs_sse.o -o obj/ethernet_fcs_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/ethernet_fcs_sse.asm 00:01:07.027 mv obj/mb_mgr_aes192_submit_sse_x8.o.tmp obj/mb_mgr_aes192_submit_sse_x8.o 00:01:07.027 nasm -MD obj/crc16_x25_sse.d -MT obj/crc16_x25_sse.o -o obj/crc16_x25_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX 
-D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc16_x25_sse.asm 00:01:07.027 nasm -MD obj/crc32_sctp_sse.d -MT obj/crc32_sctp_sse.o -o obj/crc32_sctp_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_sctp_sse.asm 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_sse_x8.o.tmp obj/mb_mgr_aes256_submit_sse_x8.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/aes256_cntr_ccm_by8_sse.o.tmp obj/aes256_cntr_ccm_by8_sse.o 00:01:07.027 nasm -MD obj/aes_cbcs_1_9_enc_128_x4.d -MT obj/aes_cbcs_1_9_enc_128_x4.o -o obj/aes_cbcs_1_9_enc_128_x4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes_cbcs_1_9_enc_128_x4.asm 00:01:07.027 mv obj/mb_mgr_aes256_submit_sse_x8.o.tmp obj/mb_mgr_aes256_submit_sse_x8.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_sse.o.tmp obj/mb_mgr_aes_xcbc_submit_sse.o 00:01:07.027 mv obj/aes256_cntr_ccm_by8_sse.o.tmp obj/aes256_cntr_ccm_by8_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/sha512_x2_sse.o.tmp obj/sha512_x2_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/ethernet_fcs_sse.o.tmp obj/ethernet_fcs_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_flush_sse.o.tmp obj/mb_mgr_aes_xcbc_flush_sse.o 00:01:07.027 ld -r -z ibt -z shstk -o obj/crc16_x25_sse.o.tmp obj/crc16_x25_sse.o 00:01:07.027 nasm -MD obj/aes128_cbcs_1_9_dec_by4_sse.d -MT obj/aes128_cbcs_1_9_dec_by4_sse.o -o obj/aes128_cbcs_1_9_dec_by4_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/aes128_cbcs_1_9_dec_by4_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_sctp_sse.o.tmp obj/crc32_sctp_sse.o 00:01:07.028 mv obj/mb_mgr_aes_xcbc_submit_sse.o.tmp obj/mb_mgr_aes_xcbc_submit_sse.o 00:01:07.028 mv obj/sha512_x2_sse.o.tmp obj/sha512_x2_sse.o 00:01:07.028 mv obj/ethernet_fcs_sse.o.tmp obj/ethernet_fcs_sse.o 00:01:07.028 mv obj/mb_mgr_aes_xcbc_flush_sse.o.tmp 
obj/mb_mgr_aes_xcbc_flush_sse.o 00:01:07.028 mv obj/crc16_x25_sse.o.tmp obj/crc16_x25_sse.o 00:01:07.028 nasm -MD obj/crc32_refl_by8_sse.d -MT obj/crc32_refl_by8_sse.o -o obj/crc32_refl_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_refl_by8_sse.asm 00:01:07.028 mv obj/crc32_sctp_sse.o.tmp obj/crc32_sctp_sse.o 00:01:07.028 nasm -MD obj/crc32_by8_sse.d -MT obj/crc32_by8_sse.o -o obj/crc32_by8_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_by8_sse.asm 00:01:07.028 nasm -MD obj/crc32_lte_sse.d -MT obj/crc32_lte_sse.o -o obj/crc32_lte_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_lte_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_flush_sse.o.tmp obj/mb_mgr_hmac_md5_flush_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_sse.o.tmp obj/mb_mgr_hmac_flush_sse.o 00:01:07.028 nasm -MD obj/crc32_fp_sse.d -MT obj/crc32_fp_sse.o -o obj/crc32_fp_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_fp_sse.asm 00:01:07.028 nasm -MD obj/crc32_iuup_sse.d -MT obj/crc32_iuup_sse.o -o obj/crc32_iuup_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_iuup_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_sse.o 00:01:07.028 nasm -MD obj/crc32_wimax_sse.d -MT obj/crc32_wimax_sse.o -o obj/crc32_wimax_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/crc32_wimax_sse.asm 00:01:07.028 mv obj/mb_mgr_hmac_md5_flush_sse.o.tmp obj/mb_mgr_hmac_md5_flush_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_flush_sse.o.tmp 
obj/mb_mgr_hmac_flush_sse.o 00:01:07.028 nasm -MD obj/chacha20_sse.d -MT obj/chacha20_sse.o -o obj/chacha20_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/chacha20_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_submit_sse.o.tmp obj/mb_mgr_hmac_md5_submit_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/aes128_cbcs_1_9_dec_by4_sse.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_sha_224_flush_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_sse.o 00:01:07.028 mv obj/mb_mgr_aes_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse.o 00:01:07.028 nasm -MD obj/memcpy_sse.d -MT obj/memcpy_sse.o -o obj/memcpy_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/memcpy_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_lte_sse.o.tmp obj/crc32_lte_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_sse.o.tmp obj/mb_mgr_hmac_sha_512_flush_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_ni_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_md5_submit_sse.o.tmp obj/mb_mgr_hmac_md5_submit_sse.o 00:01:07.028 mv obj/aes128_cbcs_1_9_dec_by4_sse.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_fp_sse.o.tmp obj/crc32_fp_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_ni_sse.o.tmp obj/mb_mgr_hmac_flush_ni_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_iuup_sse.o.tmp obj/crc32_iuup_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_wimax_sse.o.tmp obj/crc32_wimax_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_sha_256_flush_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_sse.o 00:01:07.028 mv obj/crc32_lte_sse.o.tmp obj/crc32_lte_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_sha_512_flush_sse.o.tmp 
obj/mb_mgr_hmac_sha_512_flush_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_sha_256_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_flush_ni_sse.o 00:01:07.028 mv obj/crc32_fp_sse.o.tmp obj/crc32_fp_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_flush_ni_sse.o.tmp obj/mb_mgr_hmac_flush_ni_sse.o 00:01:07.028 mv obj/crc32_iuup_sse.o.tmp obj/crc32_iuup_sse.o 00:01:07.028 mv obj/crc32_wimax_sse.o.tmp obj/crc32_wimax_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o 00:01:07.028 nasm -MD obj/gcm128_sse.d -MT obj/gcm128_sse.o -o obj/gcm128_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/gcm128_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_ni_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_ni_sse.o 00:01:07.028 mv obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o 00:01:07.028 nasm -MD obj/gcm192_sse.d -MT obj/gcm192_sse.o -o obj/gcm192_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/gcm192_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/aes_cbcs_1_9_enc_128_x4.o.tmp obj/aes_cbcs_1_9_enc_128_x4.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_refl_by8_sse.o.tmp obj/crc32_refl_by8_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/crc32_by8_sse.o.tmp obj/crc32_by8_sse.o 00:01:07.028 ld -r -z ibt -z shstk -o obj/memcpy_sse.o.tmp obj/memcpy_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_sha_224_flush_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_flush_ni_sse.o 00:01:07.028 mv obj/mb_mgr_hmac_sha_256_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_ni_sse.o 00:01:07.028 nasm -MD obj/gcm256_sse.d -MT obj/gcm256_sse.o -o obj/gcm256_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP 
sse/gcm256_sse.asm 00:01:07.028 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_sse.o.tmp obj/mb_mgr_hmac_submit_sse.o 00:01:07.297 mv obj/aes_cbcs_1_9_enc_128_x4.o.tmp obj/aes_cbcs_1_9_enc_128_x4.o 00:01:07.297 mv obj/crc32_refl_by8_sse.o.tmp obj/crc32_refl_by8_sse.o 00:01:07.297 mv obj/crc32_by8_sse.o.tmp obj/crc32_by8_sse.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_ni_sse.o.tmp obj/mb_mgr_hmac_submit_ni_sse.o 00:01:07.297 mv obj/memcpy_sse.o.tmp obj/memcpy_sse.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_sse.o.tmp obj/mb_mgr_hmac_sha_384_submit_sse.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_sse.o 00:01:07.297 mv obj/mb_mgr_hmac_submit_sse.o.tmp obj/mb_mgr_hmac_submit_sse.o 00:01:07.297 nasm -MD obj/aes_cbc_enc_128_x8.d -MT obj/aes_cbc_enc_128_x8.o -o obj/aes_cbc_enc_128_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbc_enc_128_x8.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_sse.o.tmp obj/mb_mgr_hmac_sha_384_flush_sse.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_ni_sse.o 00:01:07.297 mv obj/mb_mgr_hmac_submit_ni_sse.o.tmp obj/mb_mgr_hmac_submit_ni_sse.o 00:01:07.297 mv obj/mb_mgr_hmac_sha_384_submit_sse.o.tmp obj/mb_mgr_hmac_sha_384_submit_sse.o 00:01:07.297 mv obj/mb_mgr_hmac_sha_224_submit_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_sse.o 00:01:07.297 nasm -MD obj/aes_cbc_enc_192_x8.d -MT obj/aes_cbc_enc_192_x8.o -o obj/aes_cbc_enc_192_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbc_enc_192_x8.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_sse.o 00:01:07.297 mv obj/mb_mgr_hmac_sha_384_flush_sse.o.tmp obj/mb_mgr_hmac_sha_384_flush_sse.o 00:01:07.297 mv 
obj/mb_mgr_hmac_sha_224_submit_ni_sse.o.tmp obj/mb_mgr_hmac_sha_224_submit_ni_sse.o 00:01:07.297 nasm -MD obj/aes_cbc_enc_256_x8.d -MT obj/aes_cbc_enc_256_x8.o -o obj/aes_cbc_enc_256_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbc_enc_256_x8.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/aes192_cntr_by8_sse.o.tmp obj/aes192_cntr_by8_sse.o 00:01:07.297 nasm -MD obj/aes128_cbc_dec_by8_avx.d -MT obj/aes128_cbc_dec_by8_avx.o -o obj/aes128_cbc_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cbc_dec_by8_avx.asm 00:01:07.297 mv obj/mb_mgr_hmac_sha_256_submit_sse.o.tmp obj/mb_mgr_hmac_sha_256_submit_sse.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o 00:01:07.297 nasm -MD obj/aes192_cbc_dec_by8_avx.d -MT obj/aes192_cbc_dec_by8_avx.o -o obj/aes192_cbc_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes192_cbc_dec_by8_avx.asm 00:01:07.297 mv obj/aes192_cntr_by8_sse.o.tmp obj/aes192_cntr_by8_sse.o 00:01:07.297 nasm -MD obj/aes256_cbc_dec_by8_avx.d -MT obj/aes256_cbc_dec_by8_avx.o -o obj/aes256_cbc_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cbc_dec_by8_avx.asm 00:01:07.297 nasm -MD obj/pon_avx.d -MT obj/pon_avx.o -o obj/pon_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/pon_avx.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o 00:01:07.297 mv obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o 00:01:07.297 nasm -MD obj/aes128_cntr_by8_avx.d -MT obj/aes128_cntr_by8_avx.o -o obj/aes128_cntr_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX 
-D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cntr_by8_avx.asm 00:01:07.297 nasm -MD obj/aes192_cntr_by8_avx.d -MT obj/aes192_cntr_by8_avx.o -o obj/aes192_cntr_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes192_cntr_by8_avx.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_sse.o.tmp obj/mb_mgr_hmac_sha_512_submit_sse.o 00:01:07.297 mv obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o 00:01:07.297 nasm -MD obj/aes256_cntr_by8_avx.d -MT obj/aes256_cntr_by8_avx.o -o obj/aes256_cntr_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cntr_by8_avx.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o 00:01:07.297 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o 00:01:07.297 mv obj/mb_mgr_hmac_sha_512_submit_sse.o.tmp obj/mb_mgr_hmac_sha_512_submit_sse.o 00:01:07.297 mv obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o 00:01:07.297 nasm -MD obj/aes128_cntr_ccm_by8_avx.d -MT obj/aes128_cntr_ccm_by8_avx.o -o obj/aes128_cntr_ccm_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cntr_ccm_by8_avx.asm 00:01:07.297 ld -r -z ibt -z shstk -o obj/pon_sse.o.tmp obj/pon_sse.o 00:01:07.297 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o 00:01:07.297 nasm -MD obj/aes256_cntr_ccm_by8_avx.d -MT obj/aes256_cntr_ccm_by8_avx.o -o 
obj/aes256_cntr_ccm_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cntr_ccm_by8_avx.asm
00:01:07.297 nasm -MD obj/aes_ecb_by4_avx.d -MT obj/aes_ecb_by4_avx.o -o obj/aes_ecb_by4_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_ecb_by4_avx.asm
00:01:07.297 nasm -MD obj/aes_cfb_avx.d -MT obj/aes_cfb_avx.o -o obj/aes_cfb_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cfb_avx.asm
00:01:07.298 mv obj/pon_sse.o.tmp obj/pon_sse.o
00:01:07.298 nasm -MD obj/aes128_cbc_mac_x8.d -MT obj/aes128_cbc_mac_x8.o -o obj/aes128_cbc_mac_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cbc_mac_x8.asm
00:01:07.298 nasm -MD obj/aes256_cbc_mac_x8.d -MT obj/aes256_cbc_mac_x8.o -o obj/aes256_cbc_mac_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes256_cbc_mac_x8.asm
00:01:07.298 nasm -MD obj/aes_xcbc_mac_128_x8.d -MT obj/aes_xcbc_mac_128_x8.o -o obj/aes_xcbc_mac_128_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_xcbc_mac_128_x8.asm
00:01:07.298 nasm -MD obj/md5_x4x2_avx.d -MT obj/md5_x4x2_avx.o -o obj/md5_x4x2_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/md5_x4x2_avx.asm
00:01:07.298 nasm -MD obj/sha1_mult_avx.d -MT obj/sha1_mult_avx.o -o obj/sha1_mult_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha1_mult_avx.asm
00:01:07.298 nasm -MD obj/sha1_one_block_avx.d -MT obj/sha1_one_block_avx.o -o obj/sha1_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha1_one_block_avx.asm
00:01:07.298 nasm -MD obj/sha224_one_block_avx.d -MT obj/sha224_one_block_avx.o -o obj/sha224_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha224_one_block_avx.asm
00:01:07.298 nasm -MD obj/sha256_one_block_avx.d -MT obj/sha256_one_block_avx.o -o obj/sha256_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha256_one_block_avx.asm
00:01:07.298 nasm -MD obj/sha_256_mult_avx.d -MT obj/sha_256_mult_avx.o -o obj/sha_256_mult_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha_256_mult_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x8.o.tmp obj/aes_cbc_enc_128_x8.o
00:01:07.298 nasm -MD obj/sha384_one_block_avx.d -MT obj/sha384_one_block_avx.o -o obj/sha384_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha384_one_block_avx.asm
00:01:07.298 nasm -MD obj/sha512_one_block_avx.d -MT obj/sha512_one_block_avx.o -o obj/sha512_one_block_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha512_one_block_avx.asm
00:01:07.298 nasm -MD obj/sha512_x2_avx.d -MT obj/sha512_x2_avx.o -o obj/sha512_x2_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/sha512_x2_avx.asm
00:01:07.298 nasm -MD obj/zuc_avx.d -MT obj/zuc_avx.o -o obj/zuc_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/zuc_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x8.o.tmp obj/aes_cbc_enc_256_x8.o
00:01:07.298 mv obj/aes_cbc_enc_128_x8.o.tmp obj/aes_cbc_enc_128_x8.o
00:01:07.298 nasm -MD obj/mb_mgr_aes_flush_avx.d -MT obj/mb_mgr_aes_flush_avx.o -o obj/mb_mgr_aes_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_flush_avx.asm
00:01:07.298 nasm -MD obj/mb_mgr_aes_submit_avx.d -MT obj/mb_mgr_aes_submit_avx.o -o obj/mb_mgr_aes_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_submit_avx.asm
00:01:07.298 nasm -MD obj/mb_mgr_aes192_flush_avx.d -MT obj/mb_mgr_aes192_flush_avx.o -o obj/mb_mgr_aes192_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes192_flush_avx.asm
00:01:07.298 mv obj/aes_cbc_enc_256_x8.o.tmp obj/aes_cbc_enc_256_x8.o
00:01:07.298 nasm -MD obj/mb_mgr_aes192_submit_avx.d -MT obj/mb_mgr_aes192_submit_avx.o -o obj/mb_mgr_aes192_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes192_submit_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x8.o.tmp obj/aes_cbc_enc_192_x8.o
00:01:07.298 nasm -MD obj/mb_mgr_aes256_flush_avx.d -MT obj/mb_mgr_aes256_flush_avx.o -o obj/mb_mgr_aes256_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_flush_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_cfb_avx.o.tmp obj/aes_cfb_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_aes256_submit_avx.d -MT obj/mb_mgr_aes256_submit_avx.o -o obj/mb_mgr_aes256_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_submit_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by8_avx.o.tmp obj/aes128_cbc_dec_by8_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by8_avx.o.tmp obj/aes192_cbc_dec_by8_avx.o
00:01:07.298 mv obj/aes_cbc_enc_192_x8.o.tmp obj/aes_cbc_enc_192_x8.o
00:01:07.298 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_avx.d -MT obj/mb_mgr_aes_cmac_submit_flush_avx.o -o obj/mb_mgr_aes_cmac_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_cmac_submit_flush_avx.asm
00:01:07.298 mv obj/aes_cfb_avx.o.tmp obj/aes_cfb_avx.o
00:01:07.298 mv obj/aes128_cbc_dec_by8_avx.o.tmp obj/aes128_cbc_dec_by8_avx.o
00:01:07.298 mv obj/aes192_cbc_dec_by8_avx.o.tmp obj/aes192_cbc_dec_by8_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/crc32_refl_by8_sse_no_aesni.o.tmp obj/crc32_refl_by8_sse_no_aesni.o
00:01:07.298 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_avx.d -MT obj/mb_mgr_aes256_cmac_submit_flush_avx.o -o obj/mb_mgr_aes256_cmac_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_cmac_submit_flush_avx.asm
00:01:07.298 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_avx.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o -o obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_ccm_auth_submit_flush_avx.asm
00:01:07.298 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes256_ccm_auth_submit_flush_avx.asm
00:01:07.298 mv obj/crc32_refl_by8_sse_no_aesni.o.tmp obj/crc32_refl_by8_sse_no_aesni.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/sha224_one_block_avx.o.tmp obj/sha224_one_block_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/sha256_one_block_avx.o.tmp obj/sha256_one_block_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by8_avx.o.tmp obj/aes256_cbc_dec_by8_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_aes_xcbc_flush_avx.d -MT obj/mb_mgr_aes_xcbc_flush_avx.o -o obj/mb_mgr_aes_xcbc_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_xcbc_flush_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/sha384_one_block_avx.o.tmp obj/sha384_one_block_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/sha1_one_block_avx.o.tmp obj/sha1_one_block_avx.o
00:01:07.298 mv obj/sha224_one_block_avx.o.tmp obj/sha224_one_block_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x8.o.tmp obj/aes256_cbc_mac_x8.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/sha512_one_block_avx.o.tmp obj/sha512_one_block_avx.o
00:01:07.298 mv obj/sha256_one_block_avx.o.tmp obj/sha256_one_block_avx.o
00:01:07.298 mv obj/aes256_cbc_dec_by8_avx.o.tmp obj/aes256_cbc_dec_by8_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_aes_xcbc_submit_avx.d -MT obj/mb_mgr_aes_xcbc_submit_avx.o -o obj/mb_mgr_aes_xcbc_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes_xcbc_submit_avx.asm
00:01:07.298 mv obj/sha384_one_block_avx.o.tmp obj/sha384_one_block_avx.o
00:01:07.298 mv obj/sha1_one_block_avx.o.tmp obj/sha1_one_block_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_md5_flush_avx.d -MT obj/mb_mgr_hmac_md5_flush_avx.o -o obj/mb_mgr_hmac_md5_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_md5_flush_avx.asm
00:01:07.298 mv obj/aes256_cbc_mac_x8.o.tmp obj/aes256_cbc_mac_x8.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/md5_x4x2_sse.o.tmp obj/md5_x4x2_sse.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_ecb_by4_avx.o.tmp obj/aes_ecb_by4_avx.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x8.o.tmp obj/aes128_cbc_mac_x8.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_xcbc_mac_128_x8.o.tmp obj/aes_xcbc_mac_128_x8.o
00:01:07.298 mv obj/sha512_one_block_avx.o.tmp obj/sha512_one_block_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_md5_submit_avx.d -MT obj/mb_mgr_hmac_md5_submit_avx.o -o obj/mb_mgr_hmac_md5_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_md5_submit_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o
00:01:07.298 mv obj/md5_x4x2_sse.o.tmp obj/md5_x4x2_sse.o
00:01:07.298 mv obj/aes_ecb_by4_avx.o.tmp obj/aes_ecb_by4_avx.o
00:01:07.298 mv obj/aes128_cbc_mac_x8.o.tmp obj/aes128_cbc_mac_x8.o
00:01:07.298 mv obj/aes_xcbc_mac_128_x8.o.tmp obj/aes_xcbc_mac_128_x8.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_flush_avx.d -MT obj/mb_mgr_hmac_flush_avx.o -o obj/mb_mgr_hmac_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_flush_avx.asm
00:01:07.298 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_128_x4_no_aesni.o.tmp obj/aes_cbc_enc_128_x4_no_aesni.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_submit_avx.d -MT obj/mb_mgr_hmac_submit_avx.o -o obj/mb_mgr_hmac_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_submit_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_avx.o.tmp obj/mb_mgr_aes192_flush_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_224_flush_avx.d -MT obj/mb_mgr_hmac_sha_224_flush_avx.o -o obj/mb_mgr_hmac_sha_224_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_224_flush_avx.asm
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_224_submit_avx.d -MT obj/mb_mgr_hmac_sha_224_submit_avx.o -o obj/mb_mgr_hmac_sha_224_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_224_submit_avx.asm
00:01:07.298 mv obj/aes_cbc_enc_128_x4_no_aesni.o.tmp obj/aes_cbc_enc_128_x4_no_aesni.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_gfni_sse.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_sse.o
00:01:07.298 mv obj/mb_mgr_aes192_flush_avx.o.tmp obj/mb_mgr_aes192_flush_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_256_flush_avx.d -MT obj/mb_mgr_hmac_sha_256_flush_avx.o -o obj/mb_mgr_hmac_sha_256_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_256_flush_avx.asm
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_256_submit_avx.d -MT obj/mb_mgr_hmac_sha_256_submit_avx.o -o obj/mb_mgr_hmac_sha_256_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_256_submit_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_avx.o.tmp obj/mb_mgr_aes_flush_avx.o
00:01:07.298 mv obj/mb_mgr_zuc_submit_flush_gfni_sse.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_sse.o
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_avx.o.tmp obj/mb_mgr_aes_submit_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_384_flush_avx.d -MT obj/mb_mgr_hmac_sha_384_flush_avx.o -o obj/mb_mgr_hmac_sha_384_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_384_flush_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/crc32_by8_sse_no_aesni.o.tmp obj/crc32_by8_sse_no_aesni.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_384_submit_avx.d -MT obj/mb_mgr_hmac_sha_384_submit_avx.o -o obj/mb_mgr_hmac_sha_384_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_384_submit_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_avx.o.tmp obj/mb_mgr_aes192_submit_avx.o
00:01:07.298 mv obj/mb_mgr_aes_flush_avx.o.tmp obj/mb_mgr_aes_flush_avx.o
00:01:07.298 mv obj/mb_mgr_aes_submit_avx.o.tmp obj/mb_mgr_aes_submit_avx.o
00:01:07.298 nasm -MD obj/mb_mgr_hmac_sha_512_flush_avx.d -MT obj/mb_mgr_hmac_sha_512_flush_avx.o -o obj/mb_mgr_hmac_sha_512_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_512_flush_avx.asm
00:01:07.298 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_avx.o.tmp obj/mb_mgr_aes256_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_avx.o.tmp obj/mb_mgr_aes256_submit_avx.o
00:01:07.299 mv obj/crc32_by8_sse_no_aesni.o.tmp obj/crc32_by8_sse_no_aesni.o
00:01:07.299 mv obj/mb_mgr_aes192_submit_avx.o.tmp obj/mb_mgr_aes192_submit_avx.o
00:01:07.299 nasm -MD obj/mb_mgr_hmac_sha_512_submit_avx.d -MT obj/mb_mgr_hmac_sha_512_submit_avx.o -o obj/mb_mgr_hmac_sha_512_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_hmac_sha_512_submit_avx.asm
00:01:07.299 nasm -MD obj/mb_mgr_zuc_submit_flush_avx.d -MT obj/mb_mgr_zuc_submit_flush_avx.o -o obj/mb_mgr_zuc_submit_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_zuc_submit_flush_avx.asm
00:01:07.299 mv obj/mb_mgr_aes256_flush_avx.o.tmp obj/mb_mgr_aes256_flush_avx.o
00:01:07.299 mv obj/mb_mgr_aes256_submit_avx.o.tmp obj/mb_mgr_aes256_submit_avx.o
00:01:07.299 nasm -MD obj/ethernet_fcs_avx.d -MT obj/ethernet_fcs_avx.o -o obj/ethernet_fcs_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/ethernet_fcs_avx.asm
00:01:07.299 nasm -MD obj/crc16_x25_avx.d -MT obj/crc16_x25_avx.o -o obj/crc16_x25_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc16_x25_avx.asm
00:01:07.299 nasm -MD obj/aes_cbcs_1_9_enc_128_x8.d -MT obj/aes_cbcs_1_9_enc_128_x8.o -o obj/aes_cbcs_1_9_enc_128_x8.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes_cbcs_1_9_enc_128_x8.asm
00:01:07.299 nasm -MD obj/aes128_cbcs_1_9_dec_by8_avx.d -MT obj/aes128_cbcs_1_9_dec_by8_avx.o -o obj/aes128_cbcs_1_9_dec_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/aes128_cbcs_1_9_dec_by8_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/sha1_mult_sse.o.tmp obj/sha1_mult_sse.o
00:01:07.299 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_submit_avx.d -MT obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o -o obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes128_cbcs_1_9_submit_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/sha_256_mult_avx.o.tmp obj/sha_256_mult_avx.o
00:01:07.299 nasm -MD obj/mb_mgr_aes128_cbcs_1_9_flush_avx.d -MT obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o -o obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/mb_mgr_aes128_cbcs_1_9_flush_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes256_cntr_ccm_by8_avx.o.tmp obj/aes256_cntr_ccm_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_flush_avx.o.tmp obj/mb_mgr_aes_xcbc_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/ethernet_fcs_avx.o.tmp obj/ethernet_fcs_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc16_x25_avx.o.tmp obj/crc16_x25_avx.o
00:01:07.299 mv obj/sha1_mult_sse.o.tmp obj/sha1_mult_sse.o
00:01:07.299 mv obj/sha_256_mult_avx.o.tmp obj/sha_256_mult_avx.o
00:01:07.299 nasm -MD obj/crc32_refl_by8_avx.d -MT obj/crc32_refl_by8_avx.o -o obj/crc32_refl_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_refl_by8_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes128_cbc_mac_x4_no_aesni.o.tmp obj/aes128_cbc_mac_x4_no_aesni.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes128_cntr_ccm_by8_avx.o.tmp obj/aes128_cntr_ccm_by8_avx.o
00:01:07.299 mv obj/aes256_cntr_ccm_by8_avx.o.tmp obj/aes256_cntr_ccm_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_sse.o.tmp obj/mb_mgr_zuc_submit_flush_sse.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes128_cntr_by8_avx.o.tmp obj/aes128_cntr_by8_avx.o
00:01:07.299 mv obj/mb_mgr_aes_xcbc_flush_avx.o.tmp obj/mb_mgr_aes_xcbc_flush_avx.o
00:01:07.299 mv obj/ethernet_fcs_avx.o.tmp obj/ethernet_fcs_avx.o
00:01:07.299 mv obj/crc16_x25_avx.o.tmp obj/crc16_x25_avx.o
00:01:07.299 mv obj/aes128_cbc_mac_x4_no_aesni.o.tmp obj/aes128_cbc_mac_x4_no_aesni.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_flush_avx.o.tmp obj/mb_mgr_hmac_md5_flush_avx.o
00:01:07.299 mv obj/aes128_cntr_ccm_by8_avx.o.tmp obj/aes128_cntr_ccm_by8_avx.o
00:01:07.299 mv obj/mb_mgr_zuc_submit_flush_sse.o.tmp obj/mb_mgr_zuc_submit_flush_sse.o
00:01:07.299 mv obj/aes128_cntr_by8_avx.o.tmp obj/aes128_cntr_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_avx.o.tmp obj/mb_mgr_aes_xcbc_submit_avx.o
00:01:07.299 nasm -MD obj/crc32_by8_avx.d -MT obj/crc32_by8_avx.o -o obj/crc32_by8_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_by8_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_avx.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_avx.o.tmp obj/mb_mgr_hmac_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_md5_flush_avx.o.tmp obj/mb_mgr_hmac_md5_flush_avx.o
00:01:07.299 mv obj/mb_mgr_aes_xcbc_submit_avx.o.tmp obj/mb_mgr_aes_xcbc_submit_avx.o
00:01:07.299 nasm -MD obj/crc32_sctp_avx.d -MT obj/crc32_sctp_avx.o -o obj/crc32_sctp_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_sctp_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/sha512_x2_avx.o.tmp obj/sha512_x2_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_submit_avx.o.tmp obj/mb_mgr_hmac_md5_submit_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_256_flush_avx.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/sha1_mult_avx.o.tmp obj/sha1_mult_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_avx.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_flush_avx.o.tmp obj/mb_mgr_hmac_flush_avx.o
00:01:07.299 nasm -MD obj/crc32_lte_avx.d -MT obj/crc32_lte_avx.o -o obj/crc32_lte_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_lte_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_avx.o.tmp obj/mb_mgr_hmac_submit_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_refl_by8_avx.o.tmp obj/crc32_refl_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_sctp_avx.o.tmp obj/crc32_sctp_avx.o
00:01:07.299 mv obj/sha512_x2_avx.o.tmp obj/sha512_x2_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_md5_submit_avx.o.tmp obj/mb_mgr_hmac_md5_submit_avx.o
00:01:07.299 mv obj/sha1_mult_avx.o.tmp obj/sha1_mult_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_224_flush_avx.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_submit_avx.o.tmp obj/mb_mgr_hmac_submit_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes128_cbcs_1_9_dec_by8_avx.o.tmp obj/aes128_cbcs_1_9_dec_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_avx.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes_cbcs_1_9_enc_128_x8.o.tmp obj/aes_cbcs_1_9_enc_128_x8.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_by8_avx.o.tmp obj/crc32_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_avx.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_lte_avx.o.tmp obj/crc32_lte_avx.o
00:01:07.299 mv obj/crc32_refl_by8_avx.o.tmp obj/crc32_refl_by8_avx.o
00:01:07.299 mv obj/mb_mgr_aes256_cmac_submit_flush_sse.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_sse.o
00:01:07.299 mv obj/crc32_sctp_avx.o.tmp obj/crc32_sctp_avx.o
00:01:07.299 mv obj/aes128_cbcs_1_9_dec_by8_avx.o.tmp obj/aes128_cbcs_1_9_dec_by8_avx.o
00:01:07.299 mv obj/mb_mgr_aes256_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_224_submit_avx.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx.o
00:01:07.299 mv obj/aes_cbcs_1_9_enc_128_x8.o.tmp obj/aes_cbcs_1_9_enc_128_x8.o
00:01:07.299 mv obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o
00:01:07.299 mv obj/crc32_by8_avx.o.tmp obj/crc32_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes_cmac_submit_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_avx.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_256_submit_avx.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx.o
00:01:07.299 mv obj/crc32_lte_avx.o.tmp obj/crc32_lte_avx.o
00:01:07.299 nasm -MD obj/crc32_fp_avx.d -MT obj/crc32_fp_avx.o -o obj/crc32_fp_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_fp_avx.asm
00:01:07.299 mv obj/mb_mgr_aes_cmac_submit_flush_avx.o.tmp obj/mb_mgr_aes_cmac_submit_flush_avx.o
00:01:07.299 nasm -MD obj/crc32_iuup_avx.d -MT obj/crc32_iuup_avx.o -o obj/crc32_iuup_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_iuup_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/aes192_cntr_by8_avx.o.tmp obj/aes192_cntr_by8_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_avx.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_512_flush_avx.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx.o
00:01:07.299 nasm -MD obj/crc32_wimax_avx.d -MT obj/crc32_wimax_avx.o -o obj/crc32_wimax_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/crc32_wimax_avx.asm
00:01:07.299 mv obj/aes192_cntr_by8_avx.o.tmp obj/aes192_cntr_by8_avx.o
00:01:07.299 mv obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_384_flush_avx.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx.o
00:01:07.299 mv obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o.tmp obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o
00:01:07.299 nasm -MD obj/chacha20_avx.d -MT obj/chacha20_avx.o -o obj/chacha20_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/chacha20_avx.asm
00:01:07.299 nasm -MD obj/memcpy_avx.d -MT obj/memcpy_avx.o -o obj/memcpy_avx.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/memcpy_avx.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_fp_avx.o.tmp obj/crc32_fp_avx.o
00:01:07.299 nasm -MD obj/gcm128_avx_gen2.d -MT obj/gcm128_avx_gen2.o -o obj/gcm128_avx_gen2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/gcm128_avx_gen2.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_avx.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx.o
00:01:07.299 nasm -MD obj/gcm192_avx_gen2.d -MT obj/gcm192_avx_gen2.o -o obj/gcm192_avx_gen2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/gcm192_avx_gen2.asm
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_iuup_avx.o.tmp obj/crc32_iuup_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/crc32_wimax_avx.o.tmp obj/crc32_wimax_avx.o
00:01:07.299 mv obj/crc32_fp_avx.o.tmp obj/crc32_fp_avx.o
00:01:07.299 nasm -MD obj/gcm256_avx_gen2.d -MT obj/gcm256_avx_gen2.o -o obj/gcm256_avx_gen2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx/gcm256_avx_gen2.asm
00:01:07.299 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o
00:01:07.299 mv obj/mb_mgr_hmac_sha_384_submit_avx.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx.o
00:01:07.299 mv obj/crc32_iuup_avx.o.tmp obj/crc32_iuup_avx.o
00:01:07.299 mv obj/crc32_wimax_avx.o.tmp obj/crc32_wimax_avx.o
00:01:07.299 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_avx.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx.o
00:01:07.299 nasm -MD obj/md5_x8x2_avx2.d -MT obj/md5_x8x2_avx2.o -o obj/md5_x8x2_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/md5_x8x2_avx2.asm
00:01:07.299 nasm -MD obj/sha1_x8_avx2.d -MT obj/sha1_x8_avx2.o -o obj/sha1_x8_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/sha1_x8_avx2.asm
00:01:07.300 nasm -MD obj/sha256_oct_avx2.d -MT obj/sha256_oct_avx2.o -o obj/sha256_oct_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/sha256_oct_avx2.asm
00:01:07.300 mv obj/mb_mgr_hmac_sha_512_submit_avx.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx.o
00:01:07.300 nasm -MD obj/sha512_x4_avx2.d -MT obj/sha512_x4_avx2.o -o obj/sha512_x4_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/sha512_x4_avx2.asm
00:01:07.300 nasm -MD obj/zuc_avx2.d -MT obj/zuc_avx2.o -o obj/zuc_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/zuc_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_md5_flush_avx2.d -MT obj/mb_mgr_hmac_md5_flush_avx2.o -o obj/mb_mgr_hmac_md5_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_md5_flush_avx2.asm
00:01:07.300 ld -r -z ibt -z shstk -o obj/memcpy_avx.o.tmp obj/memcpy_avx.o
00:01:07.300 nasm -MD obj/mb_mgr_hmac_md5_submit_avx2.d -MT obj/mb_mgr_hmac_md5_submit_avx2.o -o obj/mb_mgr_hmac_md5_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_md5_submit_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_flush_avx2.d -MT obj/mb_mgr_hmac_flush_avx2.o -o obj/mb_mgr_hmac_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_flush_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_submit_avx2.d -MT obj/mb_mgr_hmac_submit_avx2.o -o obj/mb_mgr_hmac_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_submit_avx2.asm
00:01:07.300 mv obj/memcpy_avx.o.tmp obj/memcpy_avx.o
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_224_flush_avx2.d -MT obj/mb_mgr_hmac_sha_224_flush_avx2.o -o obj/mb_mgr_hmac_sha_224_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_224_flush_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_224_submit_avx2.d -MT obj/mb_mgr_hmac_sha_224_submit_avx2.o -o obj/mb_mgr_hmac_sha_224_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_224_submit_avx2.asm
00:01:07.300 ld -r -z ibt -z shstk -o obj/aes256_cntr_by8_avx.o.tmp obj/aes256_cntr_by8_avx.o
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_256_flush_avx2.d -MT obj/mb_mgr_hmac_sha_256_flush_avx2.o -o obj/mb_mgr_hmac_sha_256_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_256_flush_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_256_submit_avx2.d -MT obj/mb_mgr_hmac_sha_256_submit_avx2.o -o obj/mb_mgr_hmac_sha_256_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_256_submit_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_384_flush_avx2.d -MT obj/mb_mgr_hmac_sha_384_flush_avx2.o -o obj/mb_mgr_hmac_sha_384_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_384_flush_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_384_submit_avx2.d -MT obj/mb_mgr_hmac_sha_384_submit_avx2.o -o obj/mb_mgr_hmac_sha_384_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_384_submit_avx2.asm
00:01:07.300 mv obj/aes256_cntr_by8_avx.o.tmp obj/aes256_cntr_by8_avx.o
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_512_flush_avx2.d -MT obj/mb_mgr_hmac_sha_512_flush_avx2.o -o obj/mb_mgr_hmac_sha_512_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_512_flush_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_hmac_sha_512_submit_avx2.d -MT obj/mb_mgr_hmac_sha_512_submit_avx2.o -o obj/mb_mgr_hmac_sha_512_submit_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_hmac_sha_512_submit_avx2.asm
00:01:07.300 nasm -MD obj/mb_mgr_zuc_submit_flush_avx2.d -MT obj/mb_mgr_zuc_submit_flush_avx2.o -o obj/mb_mgr_zuc_submit_flush_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/mb_mgr_zuc_submit_flush_avx2.asm
00:01:07.300 nasm -MD obj/chacha20_avx2.d -MT obj/chacha20_avx2.o -o obj/chacha20_avx2.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/chacha20_avx2.asm
00:01:07.300 nasm -MD obj/gcm128_avx_gen4.d -MT obj/gcm128_avx_gen4.o -o obj/gcm128_avx_gen4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/gcm128_avx_gen4.asm
00:01:07.300 nasm -MD obj/gcm192_avx_gen4.d -MT obj/gcm192_avx_gen4.o -o obj/gcm192_avx_gen4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/gcm192_avx_gen4.asm
00:01:07.300 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_avx.o.tmp obj/mb_mgr_zuc_submit_flush_avx.o
00:01:07.300 nasm -MD obj/gcm256_avx_gen4.d -MT obj/gcm256_avx_gen4.o -o obj/gcm256_avx_gen4.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx2/gcm256_avx_gen4.asm
00:01:07.300 nasm -MD obj/sha1_x16_avx512.d -MT obj/sha1_x16_avx512.o -o obj/sha1_x16_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/sha1_x16_avx512.asm
00:01:07.300 nasm -MD obj/sha256_x16_avx512.d -MT obj/sha256_x16_avx512.o -o obj/sha256_x16_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/sha256_x16_avx512.asm
00:01:07.300 nasm -MD obj/sha512_x8_avx512.d -MT obj/sha512_x8_avx512.o -o obj/sha512_x8_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/sha512_x8_avx512.asm
00:01:07.300 mv obj/mb_mgr_zuc_submit_flush_avx.o.tmp obj/mb_mgr_zuc_submit_flush_avx.o
00:01:07.300 nasm -MD obj/des_x16_avx512.d -MT obj/des_x16_avx512.o -o obj/des_x16_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/des_x16_avx512.asm
00:01:07.300 nasm -MD obj/cntr_vaes_avx512.d -MT obj/cntr_vaes_avx512.o -o obj/cntr_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/cntr_vaes_avx512.asm
00:01:07.300 nasm -MD obj/cntr_ccm_vaes_avx512.d -MT obj/cntr_ccm_vaes_avx512.o -o obj/cntr_ccm_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/cntr_ccm_vaes_avx512.asm
00:01:07.300 nasm -MD obj/aes_cbc_dec_vaes_avx512.d -MT obj/aes_cbc_dec_vaes_avx512.o -o obj/aes_cbc_dec_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbc_dec_vaes_avx512.asm
00:01:07.300 nasm -MD obj/aes_cbc_enc_vaes_avx512.d -MT obj/aes_cbc_enc_vaes_avx512.o -o obj/aes_cbc_enc_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbc_enc_vaes_avx512.asm
00:01:07.300 nasm -MD obj/aes_cbcs_enc_vaes_avx512.d -MT obj/aes_cbcs_enc_vaes_avx512.o -o obj/aes_cbcs_enc_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbcs_enc_vaes_avx512.asm
00:01:07.300 nasm -MD obj/aes_cbcs_dec_vaes_avx512.d -MT obj/aes_cbcs_dec_vaes_avx512.o -o obj/aes_cbcs_dec_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_cbcs_dec_vaes_avx512.asm
00:01:07.300 nasm -MD obj/aes_docsis_dec_avx512.d -MT obj/aes_docsis_dec_avx512.o -o obj/aes_docsis_dec_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_dec_avx512.asm
00:01:07.300 nasm -MD obj/aes_docsis_enc_avx512.d -MT obj/aes_docsis_enc_avx512.o -o obj/aes_docsis_enc_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_enc_avx512.asm
00:01:07.300 nasm -MD obj/aes_docsis_dec_vaes_avx512.d -MT obj/aes_docsis_dec_vaes_avx512.o -o obj/aes_docsis_dec_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_dec_vaes_avx512.asm
00:01:07.300 nasm -MD obj/aes_docsis_enc_vaes_avx512.d -MT obj/aes_docsis_enc_vaes_avx512.o -o obj/aes_docsis_enc_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/aes_docsis_enc_vaes_avx512.asm
00:01:07.300 ld -r -z ibt -z shstk -o obj/aes128_cntr_by8_sse.o.tmp obj/aes128_cntr_by8_sse.o
00:01:07.300 mv obj/aes128_cntr_by8_sse.o.tmp obj/aes128_cntr_by8_sse.o
00:01:07.300 nasm -MD obj/zuc_avx512.d -MT obj/zuc_avx512.o -o obj/zuc_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/zuc_avx512.asm
00:01:07.300 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_avx2.o.tmp obj/mb_mgr_hmac_flush_avx2.o
00:01:07.300 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx2.o
00:01:07.300 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx2.o
00:01:07.300 mv obj/mb_mgr_hmac_flush_avx2.o.tmp obj/mb_mgr_hmac_flush_avx2.o
00:01:07.300 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx2.o
00:01:07.300 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx2.o
00:01:07.300 mv obj/mb_mgr_hmac_sha_224_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx2.o
00:01:07.300 mv obj/mb_mgr_hmac_sha_384_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx2.o
00:01:07.566 ld -r -z ibt -z shstk -o obj/sha256_oct_avx2.o.tmp obj/sha256_oct_avx2.o
00:01:07.566 ld -r -z ibt -z shstk -o obj/md5_x4x2_avx.o.tmp obj/md5_x4x2_avx.o
00:01:07.566 mv obj/mb_mgr_hmac_sha_256_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx2.o
00:01:07.566 nasm -MD obj/mb_mgr_aes_submit_avx512.d -MT obj/mb_mgr_aes_submit_avx512.o -o obj/mb_mgr_aes_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_submit_avx512.asm
00:01:07.566 mv obj/mb_mgr_hmac_sha_512_flush_avx2.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx2.o
00:01:07.566 mv obj/sha256_oct_avx2.o.tmp obj/sha256_oct_avx2.o
00:01:07.566 nasm -MD obj/mb_mgr_aes_flush_avx512.d -MT obj/mb_mgr_aes_flush_avx512.o -o obj/mb_mgr_aes_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_flush_avx512.asm
00:01:07.566 mv obj/md5_x4x2_avx.o.tmp obj/md5_x4x2_avx.o
00:01:07.566 nasm -MD obj/mb_mgr_aes192_submit_avx512.d -MT obj/mb_mgr_aes192_submit_avx512.o -o obj/mb_mgr_aes192_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes192_submit_avx512.asm
00:01:07.566 nasm -MD obj/mb_mgr_aes192_flush_avx512.d -MT obj/mb_mgr_aes192_flush_avx512.o -o obj/mb_mgr_aes192_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes192_flush_avx512.asm
00:01:07.566 nasm -MD obj/mb_mgr_aes256_submit_avx512.d -MT obj/mb_mgr_aes256_submit_avx512.o -o obj/mb_mgr_aes256_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_submit_avx512.asm
00:01:07.566 nasm -MD obj/mb_mgr_aes256_flush_avx512.d -MT obj/mb_mgr_aes256_flush_avx512.o -o obj/mb_mgr_aes256_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_flush_avx512.asm
00:01:07.566 nasm -MD obj/mb_mgr_hmac_flush_avx512.d -MT obj/mb_mgr_hmac_flush_avx512.o -o obj/mb_mgr_hmac_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_flush_avx512.asm
00:01:07.566 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_avx2.o.tmp obj/mb_mgr_hmac_submit_avx2.o
00:01:07.566 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx2.o
00:01:07.566 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx2.o
00:01:07.566 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_avx2.o.tmp
obj/mb_mgr_hmac_sha_384_submit_avx2.o 00:01:07.566 ld -r -z ibt -z shstk -o obj/aes_cbcs_enc_vaes_avx512.o.tmp obj/aes_cbcs_enc_vaes_avx512.o 00:01:07.566 mv obj/mb_mgr_hmac_submit_avx2.o.tmp obj/mb_mgr_hmac_submit_avx2.o 00:01:07.566 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx2.o 00:01:07.566 mv obj/mb_mgr_hmac_sha_224_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx2.o 00:01:07.566 mv obj/mb_mgr_hmac_sha_256_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx2.o 00:01:07.566 mv obj/mb_mgr_hmac_sha_384_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx2.o 00:01:07.566 mv obj/aes_cbcs_enc_vaes_avx512.o.tmp obj/aes_cbcs_enc_vaes_avx512.o 00:01:07.566 mv obj/mb_mgr_hmac_sha_512_submit_avx2.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx2.o 00:01:07.566 nasm -MD obj/mb_mgr_hmac_submit_avx512.d -MT obj/mb_mgr_hmac_submit_avx512.o -o obj/mb_mgr_hmac_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_submit_avx512.asm 00:01:07.566 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_192_x4_no_aesni.o.tmp obj/aes_cbc_enc_192_x4_no_aesni.o 00:01:07.566 ld -r -z ibt -z shstk -o obj/aes256_cntr_by8_sse.o.tmp obj/aes256_cntr_by8_sse.o 00:01:07.566 nasm -MD obj/mb_mgr_hmac_sha_224_flush_avx512.d -MT obj/mb_mgr_hmac_sha_224_flush_avx512.o -o obj/mb_mgr_hmac_sha_224_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_224_flush_avx512.asm 00:01:07.566 nasm -MD obj/mb_mgr_hmac_sha_224_submit_avx512.d -MT obj/mb_mgr_hmac_sha_224_submit_avx512.o -o obj/mb_mgr_hmac_sha_224_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_224_submit_avx512.asm 00:01:07.567 nasm -MD obj/mb_mgr_hmac_sha_256_flush_avx512.d -MT obj/mb_mgr_hmac_sha_256_flush_avx512.o -o obj/mb_mgr_hmac_sha_256_flush_avx512.o -Werror -felf64 
-Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_256_flush_avx512.asm 00:01:07.567 mv obj/aes_cbc_enc_192_x4_no_aesni.o.tmp obj/aes_cbc_enc_192_x4_no_aesni.o 00:01:07.567 mv obj/aes256_cntr_by8_sse.o.tmp obj/aes256_cntr_by8_sse.o 00:01:07.567 nasm -MD obj/mb_mgr_hmac_sha_256_submit_avx512.d -MT obj/mb_mgr_hmac_sha_256_submit_avx512.o -o obj/mb_mgr_hmac_sha_256_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_256_submit_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/sha1_x8_avx2.o.tmp obj/sha1_x8_avx2.o 00:01:07.567 nasm -MD obj/mb_mgr_hmac_sha_384_flush_avx512.d -MT obj/mb_mgr_hmac_sha_384_flush_avx512.o -o obj/mb_mgr_hmac_sha_384_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_384_flush_avx512.asm 00:01:07.567 nasm -MD obj/mb_mgr_hmac_sha_384_submit_avx512.d -MT obj/mb_mgr_hmac_sha_384_submit_avx512.o -o obj/mb_mgr_hmac_sha_384_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_384_submit_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/sha512_x4_avx2.o.tmp obj/sha512_x4_avx2.o 00:01:07.567 nasm -MD obj/mb_mgr_hmac_sha_512_flush_avx512.d -MT obj/mb_mgr_hmac_sha_512_flush_avx512.o -o obj/mb_mgr_hmac_sha_512_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_512_flush_avx512.asm 00:01:07.567 mv obj/sha1_x8_avx2.o.tmp obj/sha1_x8_avx2.o 00:01:07.567 mv obj/sha512_x4_avx2.o.tmp obj/sha512_x4_avx2.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_256_x4_no_aesni.o.tmp obj/aes_cbc_enc_256_x4_no_aesni.o 00:01:07.567 nasm -MD obj/mb_mgr_hmac_sha_512_submit_avx512.d -MT obj/mb_mgr_hmac_sha_512_submit_avx512.o -o obj/mb_mgr_hmac_sha_512_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf 
-DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_hmac_sha_512_submit_avx512.asm 00:01:07.567 nasm -MD obj/mb_mgr_des_avx512.d -MT obj/mb_mgr_des_avx512.o -o obj/mb_mgr_des_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_des_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/sha256_x16_avx512.o.tmp obj/sha256_x16_avx512.o 00:01:07.567 mv obj/aes_cbc_enc_256_x4_no_aesni.o.tmp obj/aes_cbc_enc_256_x4_no_aesni.o 00:01:07.567 mv obj/sha256_x16_avx512.o.tmp obj/sha256_x16_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_submit_avx512.o.tmp obj/mb_mgr_aes_submit_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_cmac_submit_flush_vaes_avx512.asm 00:01:07.567 mv obj/mb_mgr_aes_submit_avx512.o.tmp obj/mb_mgr_aes_submit_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_submit_avx512.o.tmp obj/mb_mgr_aes192_submit_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/sha1_x16_avx512.o.tmp obj/sha1_x16_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_submit_avx512.o.tmp obj/mb_mgr_aes256_submit_avx512.o 00:01:07.567 mv obj/mb_mgr_aes192_submit_avx512.o.tmp obj/mb_mgr_aes192_submit_avx512.o 00:01:07.567 mv obj/sha1_x16_avx512.o.tmp obj/sha1_x16_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o -o 
obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.asm 00:01:07.567 mv obj/mb_mgr_aes256_submit_avx512.o.tmp obj/mb_mgr_aes256_submit_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/sha512_x8_avx512.o.tmp obj/sha512_x8_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.d -MT obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o -o obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_flush_avx2.o.tmp obj/mb_mgr_hmac_md5_flush_avx2.o 00:01:07.567 mv obj/sha512_x8_avx512.o.tmp obj/sha512_x8_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_zuc_submit_flush_avx512.d -MT obj/mb_mgr_zuc_submit_flush_avx512.o -o obj/mb_mgr_zuc_submit_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_zuc_submit_flush_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/aes_xcbc_mac_128_x4_no_aesni.o.tmp obj/aes_xcbc_mac_128_x4_no_aesni.o 00:01:07.567 nasm -MD obj/mb_mgr_zuc_submit_flush_gfni_avx512.d -MT obj/mb_mgr_zuc_submit_flush_gfni_avx512.o -o obj/mb_mgr_zuc_submit_flush_gfni_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_zuc_submit_flush_gfni_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_flush_avx512.o.tmp obj/mb_mgr_hmac_flush_avx512.o 
00:01:07.567 mv obj/mb_mgr_hmac_md5_flush_avx2.o.tmp obj/mb_mgr_hmac_md5_flush_avx2.o 00:01:07.567 mv obj/aes_xcbc_mac_128_x4_no_aesni.o.tmp obj/aes_xcbc_mac_128_x4_no_aesni.o 00:01:07.567 mv obj/mb_mgr_hmac_flush_avx512.o.tmp obj/mb_mgr_hmac_flush_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx512.o 00:01:07.567 nasm -MD obj/chacha20_avx512.d -MT obj/chacha20_avx512.o -o obj/chacha20_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/chacha20_avx512.asm 00:01:07.567 nasm -MD obj/poly_avx512.d -MT obj/poly_avx512.o -o obj/poly_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/poly_avx512.asm 00:01:07.567 mv obj/mb_mgr_hmac_sha_224_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_224_flush_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_md5_submit_avx2.o.tmp obj/mb_mgr_hmac_md5_submit_avx2.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/pon_avx.o.tmp obj/pon_avx.o 00:01:07.567 nasm -MD obj/poly_fma_avx512.d -MT obj/poly_fma_avx512.o -o obj/poly_fma_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/poly_fma_avx512.asm 00:01:07.567 mv obj/mb_mgr_hmac_md5_submit_avx2.o.tmp obj/mb_mgr_hmac_md5_submit_avx2.o 00:01:07.567 nasm -MD obj/ethernet_fcs_avx512.d -MT obj/ethernet_fcs_avx512.o -o obj/ethernet_fcs_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/ethernet_fcs_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx512.o 00:01:07.567 mv obj/pon_avx.o.tmp obj/pon_avx.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_submit_avx512.o.tmp obj/mb_mgr_hmac_submit_avx512.o 00:01:07.567 nasm -MD obj/crc16_x25_avx512.d -MT obj/crc16_x25_avx512.o -o obj/crc16_x25_avx512.o -Werror -felf64 
-Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc16_x25_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_sha_384_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_384_flush_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_submit_avx512.o.tmp obj/mb_mgr_hmac_submit_avx512.o 00:01:07.567 nasm -MD obj/crc32_refl_by16_vclmul_avx512.d -MT obj/crc32_refl_by16_vclmul_avx512.o -o obj/crc32_refl_by16_vclmul_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_refl_by16_vclmul_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_flush_avx512.o.tmp obj/mb_mgr_aes_flush_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_aes192_flush_avx512.o.tmp obj/mb_mgr_aes192_flush_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_384_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_384_submit_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/crc16_x25_avx512.o.tmp obj/crc16_x25_avx512.o 00:01:07.567 ld -r -z ibt -z shstk -o obj/ethernet_fcs_avx512.o.tmp obj/ethernet_fcs_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_sha_256_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_256_flush_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_sha_512_flush_avx512.o.tmp obj/mb_mgr_hmac_sha_512_flush_avx512.o 00:01:07.567 nasm -MD obj/crc32_by16_vclmul_avx512.d -MT obj/crc32_by16_vclmul_avx512.o -o obj/crc32_by16_vclmul_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_by16_vclmul_avx512.asm 00:01:07.567 mv obj/mb_mgr_aes_flush_avx512.o.tmp obj/mb_mgr_aes_flush_avx512.o 00:01:07.567 mv obj/mb_mgr_aes192_flush_avx512.o.tmp obj/mb_mgr_aes192_flush_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_sha_384_submit_avx512.o.tmp 
obj/mb_mgr_hmac_sha_384_submit_avx512.o 00:01:07.567 mv obj/crc16_x25_avx512.o.tmp obj/crc16_x25_avx512.o 00:01:07.567 mv obj/ethernet_fcs_avx512.o.tmp obj/ethernet_fcs_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes_cbcs_1_9_submit_avx512.d -MT obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o -o obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_cbcs_1_9_submit_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_224_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx512.o 00:01:07.567 nasm -MD obj/mb_mgr_aes_cbcs_1_9_flush_avx512.d -MT obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o -o obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/mb_mgr_aes_cbcs_1_9_flush_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_512_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx512.o 00:01:07.567 nasm -MD obj/crc32_sctp_avx512.d -MT obj/crc32_sctp_avx512.o -o obj/crc32_sctp_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_sctp_avx512.asm 00:01:07.567 nasm -MD obj/crc32_lte_avx512.d -MT obj/crc32_lte_avx512.o -o obj/crc32_lte_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_lte_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/crc32_refl_by16_vclmul_avx512.o.tmp obj/crc32_refl_by16_vclmul_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_sha_224_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_224_submit_avx512.o 00:01:07.567 mv obj/mb_mgr_hmac_sha_512_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_512_submit_avx512.o 00:01:07.567 nasm -MD obj/crc32_fp_avx512.d -MT obj/crc32_fp_avx512.o -o obj/crc32_fp_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_fp_avx512.asm 00:01:07.567 nasm -MD 
obj/crc32_iuup_avx512.d -MT obj/crc32_iuup_avx512.o -o obj/crc32_iuup_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_iuup_avx512.asm 00:01:07.567 mv obj/crc32_refl_by16_vclmul_avx512.o.tmp obj/crc32_refl_by16_vclmul_avx512.o 00:01:07.567 nasm -MD obj/crc32_wimax_avx512.d -MT obj/crc32_wimax_avx512.o -o obj/crc32_wimax_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/crc32_wimax_avx512.asm 00:01:07.567 nasm -MD obj/gcm128_vaes_avx512.d -MT obj/gcm128_vaes_avx512.o -o obj/gcm128_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm128_vaes_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/crc32_by16_vclmul_avx512.o.tmp obj/crc32_by16_vclmul_avx512.o 00:01:07.567 nasm -MD obj/gcm192_vaes_avx512.d -MT obj/gcm192_vaes_avx512.o -o obj/gcm192_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm192_vaes_avx512.asm 00:01:07.567 ld -r -z ibt -z shstk -o obj/crc32_sctp_avx512.o.tmp obj/crc32_sctp_avx512.o 00:01:07.567 nasm -MD obj/gcm256_vaes_avx512.d -MT obj/gcm256_vaes_avx512.o -o obj/gcm256_vaes_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm256_vaes_avx512.asm 00:01:07.568 ld -r -z ibt -z shstk -o obj/crc32_lte_avx512.o.tmp obj/crc32_lte_avx512.o 00:01:07.568 nasm -MD obj/gcm128_avx512.d -MT obj/gcm128_avx512.o -o obj/gcm128_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm128_avx512.asm 00:01:07.568 mv obj/crc32_by16_vclmul_avx512.o.tmp obj/crc32_by16_vclmul_avx512.o 00:01:07.568 mv obj/crc32_sctp_avx512.o.tmp obj/crc32_sctp_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/crc32_fp_avx512.o.tmp obj/crc32_fp_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/crc32_iuup_avx512.o.tmp 
obj/crc32_iuup_avx512.o 00:01:07.568 mv obj/crc32_lte_avx512.o.tmp obj/crc32_lte_avx512.o 00:01:07.568 nasm -MD obj/gcm192_avx512.d -MT obj/gcm192_avx512.o -o obj/gcm192_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm192_avx512.asm 00:01:07.568 ld -r -z ibt -z shstk -o obj/crc32_wimax_avx512.o.tmp obj/crc32_wimax_avx512.o 00:01:07.568 mv obj/crc32_fp_avx512.o.tmp obj/crc32_fp_avx512.o 00:01:07.568 mv obj/crc32_iuup_avx512.o.tmp obj/crc32_iuup_avx512.o 00:01:07.568 nasm -MD obj/gcm256_avx512.d -MT obj/gcm256_avx512.o -o obj/gcm256_avx512.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP avx512/gcm256_avx512.asm 00:01:07.568 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/mb_mgr_avx.c -o obj/mb_mgr_avx.o 00:01:07.568 mv obj/crc32_wimax_avx512.o.tmp obj/crc32_wimax_avx512.o 00:01:07.568 gcc -MMD -march=haswell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx2/mb_mgr_avx2.c -o obj/mb_mgr_avx2.o 00:01:07.568 gcc -MMD -march=broadwell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx512/mb_mgr_avx512.c -o obj/mb_mgr_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_des_avx512.o.tmp obj/mb_mgr_des_avx512.o 00:01:07.568 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/mb_mgr_sse.c -o obj/mb_mgr_sse.o 00:01:07.568 mv obj/mb_mgr_des_avx512.o.tmp obj/mb_mgr_des_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/md5_x8x2_avx2.o.tmp obj/md5_x8x2_avx2.o 00:01:07.568 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/mb_mgr_sse_no_aesni.c -o obj/mb_mgr_sse_no_aesni.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o 00:01:07.568 mv obj/md5_x8x2_avx2.o.tmp obj/md5_x8x2_avx2.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/alloc.c -o obj/alloc.o 00:01:07.568 mv obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/aes_xcbc_expand_key.c -o obj/aes_xcbc_expand_key.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_gfni_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o 00:01:07.568 mv obj/mb_mgr_zuc_submit_flush_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_avx512.o 00:01:07.568 mv obj/mb_mgr_zuc_submit_flush_gfni_avx512.o.tmp obj/mb_mgr_zuc_submit_flush_gfni_avx512.o 00:01:07.568 mv obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o 00:01:07.568 mv obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/md5_one_block.c -o obj/md5_one_block.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o 00:01:07.568 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/sha_sse.c -o obj/sha_sse.o 00:01:07.568 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/sha_avx.c -o obj/sha_avx.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_flush_avx512.o.tmp obj/mb_mgr_aes256_flush_avx512.o 00:01:07.568 mv obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o 00:01:07.568 mv obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/des_key.c -o obj/des_key.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/des_basic.c -o obj/des_basic.o 00:01:07.568 mv obj/mb_mgr_aes256_flush_avx512.o.tmp obj/mb_mgr_aes256_flush_avx512.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/version.c -o obj/version.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/cpu_feature.c -o obj/cpu_feature.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o 00:01:07.568 mv obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o.tmp obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_hmac_sha_256_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx512.o 00:01:07.568 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/aesni_emu.c -o obj/aesni_emu.o 00:01:07.568 mv obj/mb_mgr_hmac_sha_256_submit_avx512.o.tmp obj/mb_mgr_hmac_sha_256_submit_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/aes128_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbc_dec_by4_sse_no_aesni.o 00:01:07.568 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/kasumi_avx.c -o obj/kasumi_avx.o 00:01:07.568 mv obj/aes128_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes128_cbc_dec_by4_sse_no_aesni.o 00:01:07.568 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/kasumi_iv.c -o obj/kasumi_iv.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o 00:01:07.568 ld -r -z ibt -z shstk -o obj/aes256_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes256_cbc_dec_by4_sse_no_aesni.o 00:01:07.569 mv obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o 00:01:07.834 mv obj/aes256_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes256_cbc_dec_by4_sse_no_aesni.o 00:01:07.834 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/kasumi_sse.c -o obj/kasumi_sse.o 00:01:07.834 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/zuc_sse_top.c -o obj/zuc_sse_top.o 00:01:07.834 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/zuc_sse_no_aesni_top.c -o obj/zuc_sse_no_aesni_top.o 00:01:07.834 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/zuc_avx_top.c -o obj/zuc_avx_top.o 00:01:07.834 gcc -MMD -march=haswell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx2/zuc_avx2_top.c -o obj/zuc_avx2_top.o 00:01:07.834 ld -r -z ibt -z shstk -o obj/aes_cbcs_dec_vaes_avx512.o.tmp obj/aes_cbcs_dec_vaes_avx512.o 00:01:07.834 ld -r -z ibt -z shstk -o obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o 00:01:07.834 gcc -MMD -march=broadwell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx512/zuc_avx512_top.c -o obj/zuc_avx512_top.o 00:01:07.834 mv obj/aes_cbcs_dec_vaes_avx512.o.tmp obj/aes_cbcs_dec_vaes_avx512.o 00:01:07.835 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/zuc_iv.c -o obj/zuc_iv.o 00:01:07.835 mv obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o.tmp obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o 00:01:07.835 gcc -MMD -march=nehalem -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC sse/snow3g_sse.c -o obj/snow3g_sse.o 00:01:07.835 gcc -MMD -march=nehalem -mno-pclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC -O1 no-aesni/snow3g_sse_no_aesni.c -o obj/snow3g_sse_no_aesni.o 00:01:07.835 ld -r -z ibt -z shstk -o obj/poly_fma_avx512.o.tmp obj/poly_fma_avx512.o 00:01:07.835 mv obj/poly_fma_avx512.o.tmp obj/poly_fma_avx512.o 00:01:07.835 gcc -MMD -march=sandybridge -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx/snow3g_avx.c -o obj/snow3g_avx.o 00:01:07.835 ld -r -z ibt -z shstk -o obj/poly_avx512.o.tmp obj/poly_avx512.o 00:01:07.835 mv obj/poly_avx512.o.tmp obj/poly_avx512.o 00:01:07.835 gcc -MMD -march=haswell -maes -mpclmul -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC avx2/snow3g_avx2.c -o obj/snow3g_avx2.o 00:01:07.835 ld -r -z ibt -z shstk -o obj/aes192_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes192_cbc_dec_by4_sse_no_aesni.o 00:01:07.835 mv obj/aes192_cbc_dec_by4_sse_no_aesni.o.tmp obj/aes192_cbc_dec_by4_sse_no_aesni.o 00:01:07.835 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/snow3g_tables.c -o obj/snow3g_tables.o 00:01:07.835 ld -r -z ibt -z shstk -o obj/aes_docsis_dec_avx512.o.tmp obj/aes_docsis_dec_avx512.o 00:01:08.094 ld -r -z ibt -z shstk -o obj/aes_cbc_enc_vaes_avx512.o.tmp obj/aes_cbc_enc_vaes_avx512.o 00:01:08.094 mv obj/aes_docsis_dec_avx512.o.tmp obj/aes_docsis_dec_avx512.o 00:01:08.094 mv obj/aes_cbc_enc_vaes_avx512.o.tmp obj/aes_cbc_enc_vaes_avx512.o 00:01:08.094 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/snow3g_iv.c -o obj/snow3g_iv.o 00:01:08.094 ld -r -z ibt -z shstk -o obj/zuc_common.o.tmp obj/zuc_common.o 00:01:08.094 nasm -MD obj/snow_v_sse.d -MT obj/snow_v_sse.o -o obj/snow_v_sse.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP sse/snow_v_sse.asm 00:01:08.094 mv obj/zuc_common.o.tmp obj/zuc_common.o 00:01:08.094 ld -r -z ibt -z shstk -o obj/mb_mgr_zuc_submit_flush_avx2.o.tmp obj/mb_mgr_zuc_submit_flush_avx2.o 00:01:08.094 nasm -MD obj/snow_v_sse_noaesni.d -MT obj/snow_v_sse_noaesni.o -o obj/snow_v_sse_noaesni.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ -I./ -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP no-aesni/snow_v_sse_noaesni.asm 00:01:08.094 mv obj/mb_mgr_zuc_submit_flush_avx2.o.tmp obj/mb_mgr_zuc_submit_flush_avx2.o 00:01:08.094 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/mb_mgr_auto.c -o obj/mb_mgr_auto.o 00:01:08.094 ld -r -z ibt -z shstk -o obj/snow_v_sse.o.tmp obj/snow_v_sse.o 00:01:08.094 mv obj/snow_v_sse.o.tmp obj/snow_v_sse.o 00:01:08.094 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . -I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/error.c -o obj/error.o 00:01:08.094 gcc -MMD -msse4.2 -c -DLINUX -DNO_COMPAT_IMB_API_053 -fPIC -I include -I . 
-I no-aesni -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -fstack-protector -D_FORTIFY_SOURCE=2 -DSAFE_DATA -DSAFE_PARAM -DSAFE_LOOKUP -O3 -fPIC x86_64/gcm.c -o obj/gcm.o 00:01:08.094 ld -r -z ibt -z shstk -o obj/aes256_cbc_mac_x4_no_aesni.o.tmp obj/aes256_cbc_mac_x4_no_aesni.o 00:01:08.094 mv obj/aes256_cbc_mac_x4_no_aesni.o.tmp obj/aes256_cbc_mac_x4_no_aesni.o 00:01:08.352 ld -r -z ibt -z shstk -o obj/aes_docsis_enc_avx512.o.tmp obj/aes_docsis_enc_avx512.o 00:01:08.352 mv obj/aes_docsis_enc_avx512.o.tmp obj/aes_docsis_enc_avx512.o 00:01:08.352 ld -r -z ibt -z shstk -o obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o.tmp obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o 00:01:08.352 mv obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o.tmp obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o 00:01:08.352 ld -r -z ibt -z shstk -o obj/snow_v_sse_noaesni.o.tmp obj/snow_v_sse_noaesni.o 00:01:08.352 mv obj/snow_v_sse_noaesni.o.tmp obj/snow_v_sse_noaesni.o 00:01:08.609 ld -r -z ibt -z shstk -o obj/zuc_sse_gfni.o.tmp obj/zuc_sse_gfni.o 00:01:08.609 mv obj/zuc_sse_gfni.o.tmp obj/zuc_sse_gfni.o 00:01:08.609 ld -r -z ibt -z shstk -o obj/aes_cbc_dec_vaes_avx512.o.tmp obj/aes_cbc_dec_vaes_avx512.o 00:01:08.609 mv obj/aes_cbc_dec_vaes_avx512.o.tmp obj/aes_cbc_dec_vaes_avx512.o 00:01:08.868 ld -r -z ibt -z shstk -o obj/aes_docsis_enc_vaes_avx512.o.tmp obj/aes_docsis_enc_vaes_avx512.o 00:01:08.868 ld -r -z ibt -z shstk -o obj/zuc_sse.o.tmp obj/zuc_sse.o 00:01:08.868 mv obj/aes_docsis_enc_vaes_avx512.o.tmp obj/aes_docsis_enc_vaes_avx512.o 00:01:08.868 mv obj/zuc_sse.o.tmp obj/zuc_sse.o 00:01:08.868 ld -r -z ibt -z shstk -o obj/chacha20_avx2.o.tmp obj/chacha20_avx2.o 00:01:08.868 mv obj/chacha20_avx2.o.tmp obj/chacha20_avx2.o 00:01:09.166 ld 
-r -z ibt -z shstk -o obj/chacha20_avx.o.tmp obj/chacha20_avx.o 00:01:09.166 mv obj/chacha20_avx.o.tmp obj/chacha20_avx.o 00:01:09.166 ld -r -z ibt -z shstk -o obj/pon_sse_no_aesni.o.tmp obj/pon_sse_no_aesni.o 00:01:09.424 mv obj/pon_sse_no_aesni.o.tmp obj/pon_sse_no_aesni.o 00:01:09.424 ld -r -z ibt -z shstk -o obj/gcm192_sse.o.tmp obj/gcm192_sse.o 00:01:09.424 mv obj/gcm192_sse.o.tmp obj/gcm192_sse.o 00:01:09.424 ld -r -z ibt -z shstk -o obj/gcm128_avx_gen2.o.tmp obj/gcm128_avx_gen2.o 00:01:09.424 mv obj/gcm128_avx_gen2.o.tmp obj/gcm128_avx_gen2.o 00:01:09.424 ld -r -z ibt -z shstk -o obj/zuc_sse_no_aesni.o.tmp obj/zuc_sse_no_aesni.o 00:01:09.682 mv obj/zuc_sse_no_aesni.o.tmp obj/zuc_sse_no_aesni.o 00:01:09.682 ld -r -z ibt -z shstk -o obj/gcm128_sse.o.tmp obj/gcm128_sse.o 00:01:09.682 mv obj/gcm128_sse.o.tmp obj/gcm128_sse.o 00:01:09.682 ld -r -z ibt -z shstk -o obj/zuc_avx.o.tmp obj/zuc_avx.o 00:01:09.682 ld -r -z ibt -z shstk -o obj/aes128_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes128_cntr_ccm_by8_sse_no_aesni.o 00:01:09.682 mv obj/zuc_avx.o.tmp obj/zuc_avx.o 00:01:09.682 mv obj/aes128_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes128_cntr_ccm_by8_sse_no_aesni.o 00:01:09.682 ld -r -z ibt -z shstk -o obj/gcm256_sse.o.tmp obj/gcm256_sse.o 00:01:09.682 mv obj/gcm256_sse.o.tmp obj/gcm256_sse.o 00:01:10.249 ld -r -z ibt -z shstk -o obj/gcm192_avx_gen2.o.tmp obj/gcm192_avx_gen2.o 00:01:10.249 mv obj/gcm192_avx_gen2.o.tmp obj/gcm192_avx_gen2.o 00:01:10.249 ld -r -z ibt -z shstk -o obj/gcm256_avx_gen2.o.tmp obj/gcm256_avx_gen2.o 00:01:10.249 mv obj/gcm256_avx_gen2.o.tmp obj/gcm256_avx_gen2.o 00:01:10.508 ld -r -z ibt -z shstk -o obj/aes256_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes256_cntr_ccm_by8_sse_no_aesni.o 00:01:10.508 mv obj/aes256_cntr_ccm_by8_sse_no_aesni.o.tmp obj/aes256_cntr_ccm_by8_sse_no_aesni.o 00:01:10.766 ld -r -z ibt -z shstk -o obj/gcm128_avx512.o.tmp obj/gcm128_avx512.o 00:01:10.766 mv obj/gcm128_avx512.o.tmp obj/gcm128_avx512.o 00:01:10.766 ld -r -z ibt -z 
shstk -o obj/aes_docsis_dec_vaes_avx512.o.tmp obj/aes_docsis_dec_vaes_avx512.o 00:01:10.766 mv obj/aes_docsis_dec_vaes_avx512.o.tmp obj/aes_docsis_dec_vaes_avx512.o 00:01:10.766 ld -r -z ibt -z shstk -o obj/gcm256_avx512.o.tmp obj/gcm256_avx512.o 00:01:10.766 mv obj/gcm256_avx512.o.tmp obj/gcm256_avx512.o 00:01:11.024 ld -r -z ibt -z shstk -o obj/cntr_ccm_vaes_avx512.o.tmp obj/cntr_ccm_vaes_avx512.o 00:01:11.024 mv obj/cntr_ccm_vaes_avx512.o.tmp obj/cntr_ccm_vaes_avx512.o 00:01:11.282 ld -r -z ibt -z shstk -o obj/chacha20_avx512.o.tmp obj/chacha20_avx512.o 00:01:11.282 mv obj/chacha20_avx512.o.tmp obj/chacha20_avx512.o 00:01:11.282 ld -r -z ibt -z shstk -o obj/gcm192_avx512.o.tmp obj/gcm192_avx512.o 00:01:11.282 mv obj/gcm192_avx512.o.tmp obj/gcm192_avx512.o 00:01:11.540 ld -r -z ibt -z shstk -o obj/aes_ecb_by4_sse_no_aesni.o.tmp obj/aes_ecb_by4_sse_no_aesni.o 00:01:11.540 mv obj/aes_ecb_by4_sse_no_aesni.o.tmp obj/aes_ecb_by4_sse_no_aesni.o 00:01:11.798 ld -r -z ibt -z shstk -o obj/gcm192_avx_gen4.o.tmp obj/gcm192_avx_gen4.o 00:01:11.798 mv obj/gcm192_avx_gen4.o.tmp obj/gcm192_avx_gen4.o 00:01:12.056 ld -r -z ibt -z shstk -o obj/gcm128_avx_gen4.o.tmp obj/gcm128_avx_gen4.o 00:01:12.056 mv obj/gcm128_avx_gen4.o.tmp obj/gcm128_avx_gen4.o 00:01:12.056 ld -r -z ibt -z shstk -o obj/aes128_cntr_by8_sse_no_aesni.o.tmp obj/aes128_cntr_by8_sse_no_aesni.o 00:01:12.056 mv obj/aes128_cntr_by8_sse_no_aesni.o.tmp obj/aes128_cntr_by8_sse_no_aesni.o 00:01:12.315 ld -r -z ibt -z shstk -o obj/gcm256_avx_gen4.o.tmp obj/gcm256_avx_gen4.o 00:01:12.315 mv obj/gcm256_avx_gen4.o.tmp obj/gcm256_avx_gen4.o 00:01:12.315 ld -r -z ibt -z shstk -o obj/aes192_cntr_by8_sse_no_aesni.o.tmp obj/aes192_cntr_by8_sse_no_aesni.o 00:01:12.315 mv obj/aes192_cntr_by8_sse_no_aesni.o.tmp obj/aes192_cntr_by8_sse_no_aesni.o 00:01:13.690 ld -r -z ibt -z shstk -o obj/des_x16_avx512.o.tmp obj/des_x16_avx512.o 00:01:13.690 mv obj/des_x16_avx512.o.tmp obj/des_x16_avx512.o 00:01:14.627 ld -r -z ibt -z shstk -o 
obj/zuc_avx2.o.tmp obj/zuc_avx2.o 00:01:14.627 mv obj/zuc_avx2.o.tmp obj/zuc_avx2.o 00:01:14.627 ld -r -z ibt -z shstk -o obj/aes256_cntr_by8_sse_no_aesni.o.tmp obj/aes256_cntr_by8_sse_no_aesni.o 00:01:14.627 mv obj/aes256_cntr_by8_sse_no_aesni.o.tmp obj/aes256_cntr_by8_sse_no_aesni.o 00:01:14.886 ld -r -z ibt -z shstk -o obj/zuc_avx512.o.tmp obj/zuc_avx512.o 00:01:14.886 mv obj/zuc_avx512.o.tmp obj/zuc_avx512.o 00:01:15.454 ld -r -z ibt -z shstk -o obj/chacha20_sse.o.tmp obj/chacha20_sse.o 00:01:15.454 mv obj/chacha20_sse.o.tmp obj/chacha20_sse.o 00:01:17.358 ld -r -z ibt -z shstk -o obj/gcm128_vaes_avx512.o.tmp obj/gcm128_vaes_avx512.o 00:01:17.358 mv obj/gcm128_vaes_avx512.o.tmp obj/gcm128_vaes_avx512.o 00:01:21.598 ld -r -z ibt -z shstk -o obj/gcm192_vaes_avx512.o.tmp obj/gcm192_vaes_avx512.o 00:01:21.598 mv obj/gcm192_vaes_avx512.o.tmp obj/gcm192_vaes_avx512.o 00:01:21.857 ld -r -z ibt -z shstk -o obj/gcm256_vaes_avx512.o.tmp obj/gcm256_vaes_avx512.o 00:01:21.857 mv obj/gcm256_vaes_avx512.o.tmp obj/gcm256_vaes_avx512.o 00:01:28.425 ld -r -z ibt -z shstk -o obj/cntr_vaes_avx512.o.tmp obj/cntr_vaes_avx512.o 00:01:28.425 mv obj/cntr_vaes_avx512.o.tmp obj/cntr_vaes_avx512.o 00:02:36.147 ld -r -z ibt -z shstk -o obj/gcm128_sse_no_aesni.o.tmp obj/gcm128_sse_no_aesni.o 00:02:36.147 mv obj/gcm128_sse_no_aesni.o.tmp obj/gcm128_sse_no_aesni.o 00:03:02.690 ld -r -z ibt -z shstk -o obj/gcm256_sse_no_aesni.o.tmp obj/gcm256_sse_no_aesni.o 00:03:02.690 mv obj/gcm256_sse_no_aesni.o.tmp obj/gcm256_sse_no_aesni.o 00:05:09.254 ld -r -z ibt -z shstk -o obj/gcm192_sse_no_aesni.o.tmp obj/gcm192_sse_no_aesni.o 00:05:09.254 mv obj/gcm192_sse_no_aesni.o.tmp obj/gcm192_sse_no_aesni.o 00:05:09.254 gcc -shared -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -Wl,-soname,libIPSec_MB.so.1 -o libIPSec_MB.so.1.0.0 obj/aes_keyexp_128.o obj/aes_keyexp_192.o obj/aes_keyexp_256.o obj/aes_cmac_subkey_gen.o obj/save_xmms.o 
obj/clear_regs_mem_fns.o obj/const.o obj/aes128_ecbenc_x3.o obj/zuc_common.o obj/wireless_common.o obj/constant_lookup.o obj/crc32_refl_const.o obj/crc32_const.o obj/poly1305.o obj/chacha20_poly1305.o obj/aes128_cbc_dec_by4_sse_no_aesni.o obj/aes192_cbc_dec_by4_sse_no_aesni.o obj/aes256_cbc_dec_by4_sse_no_aesni.o obj/aes_cbc_enc_128_x4_no_aesni.o obj/aes_cbc_enc_192_x4_no_aesni.o obj/aes_cbc_enc_256_x4_no_aesni.o obj/aes128_cntr_by8_sse_no_aesni.o obj/aes192_cntr_by8_sse_no_aesni.o obj/aes256_cntr_by8_sse_no_aesni.o obj/aes_ecb_by4_sse_no_aesni.o obj/aes128_cntr_ccm_by8_sse_no_aesni.o obj/aes256_cntr_ccm_by8_sse_no_aesni.o obj/pon_sse_no_aesni.o obj/zuc_sse_no_aesni.o obj/aes_cfb_sse_no_aesni.o obj/aes128_cbc_mac_x4_no_aesni.o obj/aes256_cbc_mac_x4_no_aesni.o obj/aes_xcbc_mac_128_x4_no_aesni.o obj/mb_mgr_aes_flush_sse_no_aesni.o obj/mb_mgr_aes_submit_sse_no_aesni.o obj/mb_mgr_aes192_flush_sse_no_aesni.o obj/mb_mgr_aes192_submit_sse_no_aesni.o obj/mb_mgr_aes256_flush_sse_no_aesni.o obj/mb_mgr_aes256_submit_sse_no_aesni.o obj/mb_mgr_aes_cmac_submit_flush_sse_no_aesni.o obj/mb_mgr_aes256_cmac_submit_flush_sse_no_aesni.o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_no_aesni.o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_no_aesni.o obj/mb_mgr_aes_xcbc_flush_sse_no_aesni.o obj/mb_mgr_aes_xcbc_submit_sse_no_aesni.o obj/mb_mgr_zuc_submit_flush_sse_no_aesni.o obj/ethernet_fcs_sse_no_aesni.o obj/crc16_x25_sse_no_aesni.o obj/aes_cbcs_1_9_enc_128_x4_no_aesni.o obj/aes128_cbcs_1_9_dec_by4_sse_no_aesni.o obj/mb_mgr_aes128_cbcs_1_9_submit_sse.o obj/mb_mgr_aes128_cbcs_1_9_flush_sse.o obj/mb_mgr_aes128_cbcs_1_9_submit_sse_no_aesni.o obj/mb_mgr_aes128_cbcs_1_9_flush_sse_no_aesni.o obj/crc32_refl_by8_sse_no_aesni.o obj/crc32_by8_sse_no_aesni.o obj/crc32_sctp_sse_no_aesni.o obj/crc32_lte_sse_no_aesni.o obj/crc32_fp_sse_no_aesni.o obj/crc32_iuup_sse_no_aesni.o obj/crc32_wimax_sse_no_aesni.o obj/gcm128_sse_no_aesni.o obj/gcm192_sse_no_aesni.o obj/gcm256_sse_no_aesni.o 
obj/aes128_cbc_dec_by4_sse.o obj/aes128_cbc_dec_by8_sse.o obj/aes192_cbc_dec_by4_sse.o obj/aes192_cbc_dec_by8_sse.o obj/aes256_cbc_dec_by4_sse.o obj/aes256_cbc_dec_by8_sse.o obj/aes_cbc_enc_128_x4.o obj/aes_cbc_enc_192_x4.o obj/aes_cbc_enc_256_x4.o obj/aes_cbc_enc_128_x8_sse.o obj/aes_cbc_enc_192_x8_sse.o obj/aes_cbc_enc_256_x8_sse.o obj/pon_sse.o obj/aes128_cntr_by8_sse.o obj/aes192_cntr_by8_sse.o obj/aes256_cntr_by8_sse.o obj/aes_ecb_by4_sse.o obj/aes128_cntr_ccm_by8_sse.o obj/aes256_cntr_ccm_by8_sse.o obj/aes_cfb_sse.o obj/aes128_cbc_mac_x4.o obj/aes256_cbc_mac_x4.o obj/aes128_cbc_mac_x8_sse.o obj/aes256_cbc_mac_x8_sse.o obj/aes_xcbc_mac_128_x4.o obj/md5_x4x2_sse.o obj/sha1_mult_sse.o obj/sha1_one_block_sse.o obj/sha224_one_block_sse.o obj/sha256_one_block_sse.o obj/sha384_one_block_sse.o obj/sha512_one_block_sse.o obj/sha512_x2_sse.o obj/sha_256_mult_sse.o obj/sha1_ni_x2_sse.o obj/sha256_ni_x2_sse.o obj/zuc_sse.o obj/zuc_sse_gfni.o obj/mb_mgr_aes_flush_sse.o obj/mb_mgr_aes_submit_sse.o obj/mb_mgr_aes192_flush_sse.o obj/mb_mgr_aes192_submit_sse.o obj/mb_mgr_aes256_flush_sse.o obj/mb_mgr_aes256_submit_sse.o obj/mb_mgr_aes_flush_sse_x8.o obj/mb_mgr_aes_submit_sse_x8.o obj/mb_mgr_aes192_flush_sse_x8.o obj/mb_mgr_aes192_submit_sse_x8.o obj/mb_mgr_aes256_flush_sse_x8.o obj/mb_mgr_aes256_submit_sse_x8.o obj/mb_mgr_aes_cmac_submit_flush_sse.o obj/mb_mgr_aes256_cmac_submit_flush_sse.o obj/mb_mgr_aes_cmac_submit_flush_sse_x8.o obj/mb_mgr_aes256_cmac_submit_flush_sse_x8.o obj/mb_mgr_aes_ccm_auth_submit_flush_sse.o obj/mb_mgr_aes_ccm_auth_submit_flush_sse_x8.o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse.o obj/mb_mgr_aes256_ccm_auth_submit_flush_sse_x8.o obj/mb_mgr_aes_xcbc_flush_sse.o obj/mb_mgr_aes_xcbc_submit_sse.o obj/mb_mgr_hmac_md5_flush_sse.o obj/mb_mgr_hmac_md5_submit_sse.o obj/mb_mgr_hmac_flush_sse.o obj/mb_mgr_hmac_submit_sse.o obj/mb_mgr_hmac_sha_224_flush_sse.o obj/mb_mgr_hmac_sha_224_submit_sse.o obj/mb_mgr_hmac_sha_256_flush_sse.o 
obj/mb_mgr_hmac_sha_256_submit_sse.o obj/mb_mgr_hmac_sha_384_flush_sse.o obj/mb_mgr_hmac_sha_384_submit_sse.o obj/mb_mgr_hmac_sha_512_flush_sse.o obj/mb_mgr_hmac_sha_512_submit_sse.o obj/mb_mgr_hmac_flush_ni_sse.o obj/mb_mgr_hmac_submit_ni_sse.o obj/mb_mgr_hmac_sha_224_flush_ni_sse.o obj/mb_mgr_hmac_sha_224_submit_ni_sse.o obj/mb_mgr_hmac_sha_256_flush_ni_sse.o obj/mb_mgr_hmac_sha_256_submit_ni_sse.o obj/mb_mgr_zuc_submit_flush_sse.o obj/mb_mgr_zuc_submit_flush_gfni_sse.o obj/ethernet_fcs_sse.o obj/crc16_x25_sse.o obj/crc32_sctp_sse.o obj/aes_cbcs_1_9_enc_128_x4.o obj/aes128_cbcs_1_9_dec_by4_sse.o obj/crc32_refl_by8_sse.o obj/crc32_by8_sse.o obj/crc32_lte_sse.o obj/crc32_fp_sse.o obj/crc32_iuup_sse.o obj/crc32_wimax_sse.o obj/chacha20_sse.o obj/memcpy_sse.o obj/gcm128_sse.o obj/gcm192_sse.o obj/gcm256_sse.o obj/aes_cbc_enc_128_x8.o obj/aes_cbc_enc_192_x8.o obj/aes_cbc_enc_256_x8.o obj/aes128_cbc_dec_by8_avx.o obj/aes192_cbc_dec_by8_avx.o obj/aes256_cbc_dec_by8_avx.o obj/pon_avx.o obj/aes128_cntr_by8_avx.o obj/aes192_cntr_by8_avx.o obj/aes256_cntr_by8_avx.o obj/aes128_cntr_ccm_by8_avx.o obj/aes256_cntr_ccm_by8_avx.o obj/aes_ecb_by4_avx.o obj/aes_cfb_avx.o obj/aes128_cbc_mac_x8.o obj/aes256_cbc_mac_x8.o obj/aes_xcbc_mac_128_x8.o obj/md5_x4x2_avx.o obj/sha1_mult_avx.o obj/sha1_one_block_avx.o obj/sha224_one_block_avx.o obj/sha256_one_block_avx.o obj/sha_256_mult_avx.o obj/sha384_one_block_avx.o obj/sha512_one_block_avx.o obj/sha512_x2_avx.o obj/zuc_avx.o obj/mb_mgr_aes_flush_avx.o obj/mb_mgr_aes_submit_avx.o obj/mb_mgr_aes192_flush_avx.o obj/mb_mgr_aes192_submit_avx.o obj/mb_mgr_aes256_flush_avx.o obj/mb_mgr_aes256_submit_avx.o obj/mb_mgr_aes_cmac_submit_flush_avx.o obj/mb_mgr_aes256_cmac_submit_flush_avx.o obj/mb_mgr_aes_ccm_auth_submit_flush_avx.o obj/mb_mgr_aes256_ccm_auth_submit_flush_avx.o obj/mb_mgr_aes_xcbc_flush_avx.o obj/mb_mgr_aes_xcbc_submit_avx.o obj/mb_mgr_hmac_md5_flush_avx.o obj/mb_mgr_hmac_md5_submit_avx.o obj/mb_mgr_hmac_flush_avx.o 
obj/mb_mgr_hmac_submit_avx.o obj/mb_mgr_hmac_sha_224_flush_avx.o obj/mb_mgr_hmac_sha_224_submit_avx.o obj/mb_mgr_hmac_sha_256_flush_avx.o obj/mb_mgr_hmac_sha_256_submit_avx.o obj/mb_mgr_hmac_sha_384_flush_avx.o obj/mb_mgr_hmac_sha_384_submit_avx.o obj/mb_mgr_hmac_sha_512_flush_avx.o obj/mb_mgr_hmac_sha_512_submit_avx.o obj/mb_mgr_zuc_submit_flush_avx.o obj/ethernet_fcs_avx.o obj/crc16_x25_avx.o obj/aes_cbcs_1_9_enc_128_x8.o obj/aes128_cbcs_1_9_dec_by8_avx.o obj/mb_mgr_aes128_cbcs_1_9_submit_avx.o obj/mb_mgr_aes128_cbcs_1_9_flush_avx.o obj/crc32_refl_by8_avx.o obj/crc32_by8_avx.o obj/crc32_sctp_avx.o obj/crc32_lte_avx.o obj/crc32_fp_avx.o obj/crc32_iuup_avx.o obj/crc32_wimax_avx.o obj/chacha20_avx.o obj/memcpy_avx.o obj/gcm128_avx_gen2.o obj/gcm192_avx_gen2.o obj/gcm256_avx_gen2.o obj/md5_x8x2_avx2.o obj/sha1_x8_avx2.o obj/sha256_oct_avx2.o obj/sha512_x4_avx2.o obj/zuc_avx2.o obj/mb_mgr_hmac_md5_flush_avx2.o obj/mb_mgr_hmac_md5_submit_avx2.o obj/mb_mgr_hmac_flush_avx2.o obj/mb_mgr_hmac_submit_avx2.o obj/mb_mgr_hmac_sha_224_flush_avx2.o obj/mb_mgr_hmac_sha_224_submit_avx2.o obj/mb_mgr_hmac_sha_256_flush_avx2.o obj/mb_mgr_hmac_sha_256_submit_avx2.o obj/mb_mgr_hmac_sha_384_flush_avx2.o obj/mb_mgr_hmac_sha_384_submit_avx2.o obj/mb_mgr_hmac_sha_512_flush_avx2.o obj/mb_mgr_hmac_sha_512_submit_avx2.o obj/mb_mgr_zuc_submit_flush_avx2.o obj/chacha20_avx2.o obj/gcm128_avx_gen4.o obj/gcm192_avx_gen4.o obj/gcm256_avx_gen4.o obj/sha1_x16_avx512.o obj/sha256_x16_avx512.o obj/sha512_x8_avx512.o obj/des_x16_avx512.o obj/cntr_vaes_avx512.o obj/cntr_ccm_vaes_avx512.o obj/aes_cbc_dec_vaes_avx512.o obj/aes_cbc_enc_vaes_avx512.o obj/aes_cbcs_enc_vaes_avx512.o obj/aes_cbcs_dec_vaes_avx512.o obj/aes_docsis_dec_avx512.o obj/aes_docsis_enc_avx512.o obj/aes_docsis_dec_vaes_avx512.o obj/aes_docsis_enc_vaes_avx512.o obj/zuc_avx512.o obj/mb_mgr_aes_submit_avx512.o obj/mb_mgr_aes_flush_avx512.o obj/mb_mgr_aes192_submit_avx512.o obj/mb_mgr_aes192_flush_avx512.o obj/mb_mgr_aes256_submit_avx512.o 
obj/mb_mgr_aes256_flush_avx512.o obj/mb_mgr_hmac_flush_avx512.o obj/mb_mgr_hmac_submit_avx512.o obj/mb_mgr_hmac_sha_224_flush_avx512.o obj/mb_mgr_hmac_sha_224_submit_avx512.o obj/mb_mgr_hmac_sha_256_flush_avx512.o obj/mb_mgr_hmac_sha_256_submit_avx512.o obj/mb_mgr_hmac_sha_384_flush_avx512.o obj/mb_mgr_hmac_sha_384_submit_avx512.o obj/mb_mgr_hmac_sha_512_flush_avx512.o obj/mb_mgr_hmac_sha_512_submit_avx512.o obj/mb_mgr_des_avx512.o obj/mb_mgr_aes_cmac_submit_flush_vaes_avx512.o obj/mb_mgr_aes256_cmac_submit_flush_vaes_avx512.o obj/mb_mgr_aes_ccm_auth_submit_flush_vaes_avx512.o obj/mb_mgr_aes256_ccm_auth_submit_flush_vaes_avx512.o obj/mb_mgr_aes_xcbc_submit_flush_vaes_avx512.o obj/mb_mgr_zuc_submit_flush_avx512.o obj/mb_mgr_zuc_submit_flush_gfni_avx512.o obj/chacha20_avx512.o obj/poly_avx512.o obj/poly_fma_avx512.o obj/ethernet_fcs_avx512.o obj/crc16_x25_avx512.o obj/crc32_refl_by16_vclmul_avx512.o obj/crc32_by16_vclmul_avx512.o obj/mb_mgr_aes_cbcs_1_9_submit_avx512.o obj/mb_mgr_aes_cbcs_1_9_flush_avx512.o obj/crc32_sctp_avx512.o obj/crc32_lte_avx512.o obj/crc32_fp_avx512.o obj/crc32_iuup_avx512.o obj/crc32_wimax_avx512.o obj/gcm128_vaes_avx512.o obj/gcm192_vaes_avx512.o obj/gcm256_vaes_avx512.o obj/gcm128_avx512.o obj/gcm192_avx512.o obj/gcm256_avx512.o obj/mb_mgr_avx.o obj/mb_mgr_avx2.o obj/mb_mgr_avx512.o obj/mb_mgr_sse.o obj/mb_mgr_sse_no_aesni.o obj/alloc.o obj/aes_xcbc_expand_key.o obj/md5_one_block.o obj/sha_sse.o obj/sha_avx.o obj/des_key.o obj/des_basic.o obj/version.o obj/cpu_feature.o obj/aesni_emu.o obj/kasumi_avx.o obj/kasumi_iv.o obj/kasumi_sse.o obj/zuc_sse_top.o obj/zuc_sse_no_aesni_top.o obj/zuc_avx_top.o obj/zuc_avx2_top.o obj/zuc_avx512_top.o obj/zuc_iv.o obj/snow3g_sse.o obj/snow3g_sse_no_aesni.o obj/snow3g_avx.o obj/snow3g_avx2.o obj/snow3g_tables.o obj/snow3g_iv.o obj/snow_v_sse.o obj/snow_v_sse_noaesni.o obj/mb_mgr_auto.o obj/error.o obj/gcm.o -lc 00:05:09.254 ln -f -s libIPSec_MB.so.1.0.0 ./libIPSec_MB.so.1 00:05:09.254 ln -f -s 
libIPSec_MB.so.1 ./libIPSec_MB.so 00:05:09.254 make[1]: Leaving directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:05:09.254 make -C test 00:05:09.254 make[1]: Entering directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/test' 00:05:09.254 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o main.o main.c 00:05:09.254 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o gcm_test.o gcm_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ctr_test.o ctr_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition 
-fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o customop_test.o customop_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o des_test.o des_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ccm_test.o ccm_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o cmac_test.o cmac_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o utils.o utils.c 
00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hmac_sha1_test.o hmac_sha1_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hmac_sha256_sha512_test.o hmac_sha256_sha512_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hmac_md5_test.o hmac_md5_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o aes_test.o aes_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra 
-Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o sha_test.o sha_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o chained_test.o chained_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o api_test.o api_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o pon_test.o pon_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn 
-Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ecb_test.o ecb_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o zuc_test.o zuc_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o kasumi_test.o kasumi_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o snow3g_test.o snow3g_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow 
-fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o direct_api_test.o direct_api_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o clear_mem_test.o clear_mem_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o hec_test.o hec_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o xcbc_test.o xcbc_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o aes_cbcs_test.o 
aes_cbcs_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o crc_test.o crc_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o chacha_test.o chacha_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o poly1305_test.o poly1305_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o chacha20_poly1305_test.o chacha20_poly1305_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra 
-Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o null_test.o null_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o snow_v_test.o snow_v_test.c 00:05:09.255 gcc -MMD -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -c -o ipsec_xvalid.o ipsec_xvalid.c 00:05:09.255 nasm -MD misc.d -MT misc.o -o misc.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ misc.asm 00:05:09.255 ld -r -z ibt -z shstk -o misc.o.tmp misc.o 00:05:09.255 mv misc.o.tmp misc.o 00:05:09.255 utils.c:166:32: warning: argument 2 of type ‘uint8_t[6]’ {aka ‘unsigned char[6]’} with mismatched bound [-Warray-parameter=] 00:05:09.255 166 | uint8_t arch_support[IMB_ARCH_NUM], 00:05:09.255 | ~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.255 In file included from utils.c:35: 00:05:09.255 utils.h:39:54: note: previously declared as ‘uint8_t *’ {aka ‘unsigned char *’} 00:05:09.256 39 | int update_flags_and_archs(const char *arg, uint8_t *arch_support, 00:05:09.256 | 
~~~~~~~~~^~~~~~~~~~~~ 00:05:09.256 utils.c:207:21: warning: argument 1 of type ‘uint8_t[6]’ {aka ‘unsigned char[6]’} with mismatched bound [-Warray-parameter=] 00:05:09.256 207 | detect_arch(uint8_t arch_support[IMB_ARCH_NUM]) 00:05:09.256 | ~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 utils.h:41:26: note: previously declared as ‘uint8_t *’ {aka ‘unsigned char *’} 00:05:09.256 41 | int detect_arch(uint8_t *arch_support); 00:05:09.256 | ~~~~~~~~~^~~~~~~~~~~~ 00:05:09.256 In file included from null_test.c:33: 00:05:09.256 null_test.c: In function ‘test_null_hash’: 00:05:09.256 ../lib/intel-ipsec-mb.h:1235:10: warning: ‘cipher_key’ may be used uninitialized [-Wmaybe-uninitialized] 00:05:09.256 1235 | ((_mgr)->keyexp_128((_raw), (_enc), (_dec))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 null_test.c:62:9: note: in expansion of macro ‘IMB_AES_KEYEXP_128’ 00:05:09.256 62 | IMB_AES_KEYEXP_128(mb_mgr, cipher_key, expkey, dust); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:1235:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, void *, void *)’ 00:05:09.256 1235 | ((_mgr)->keyexp_128((_raw), (_enc), (_dec))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 null_test.c:62:9: note: in expansion of macro ‘IMB_AES_KEYEXP_128’ 00:05:09.256 62 | IMB_AES_KEYEXP_128(mb_mgr, cipher_key, expkey, dust); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 null_test.c:47:33: note: ‘cipher_key’ declared here 00:05:09.256 47 | DECLARE_ALIGNED(uint8_t cipher_key[16], 16); 00:05:09.256 | ^~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:51:9: note: in definition of macro ‘DECLARE_ALIGNED’ 00:05:09.256 51 | decl __attribute__((aligned(alignval))) 00:05:09.256 | ^~~~ 00:05:09.256 gcc -fPIE -z noexecstack -z relro -z now -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -L../lib main.o gcm_test.o ctr_test.o customop_test.o des_test.o ccm_test.o cmac_test.o utils.o hmac_sha1_test.o 
hmac_sha256_sha512_test.o hmac_md5_test.o aes_test.o sha_test.o chained_test.o api_test.o pon_test.o ecb_test.o zuc_test.o kasumi_test.o snow3g_test.o direct_api_test.o clear_mem_test.o hec_test.o xcbc_test.o aes_cbcs_test.o crc_test.o chacha_test.o poly1305_test.o chacha20_poly1305_test.o null_test.o snow_v_test.o -lIPSec_MB -o ipsec_MB_testapp 00:05:09.256 gcc -fPIE -z noexecstack -z relro -z now -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -L../lib ipsec_xvalid.o utils.o misc.o -lIPSec_MB -o ipsec_xvalid_test 00:05:09.256 make[1]: Leaving directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/test' 00:05:09.256 make -C perf 00:05:09.256 make[1]: Entering directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/perf' 00:05:09.256 gcc -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -pthread -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -fPIE -fstack-protector -D_FORTIFY_SOURCE=2 -c -o ipsec_perf.o ipsec_perf.c 00:05:09.256 gcc -DLINUX -D_GNU_SOURCE -DNO_COMPAT_IMB_API_053 -W -Wall -Wextra -Wmissing-declarations -Wpointer-arith -Wcast-qual -Wundef -Wwrite-strings -Wformat -Wformat-security -Wunreachable-code -Wmissing-noreturn -Wsign-compare -Wno-endif-labels -Wstrict-prototypes -Wmissing-prototypes -Wold-style-definition -pthread -fno-strict-overflow -fno-delete-null-pointer-checks -fwrapv -fcf-protection=full -I../lib/include -I../lib -O3 -fPIE -fstack-protector -D_FORTIFY_SOURCE=2 -c -o msr.o msr.c 00:05:09.256 nasm -MD misc.d -MT misc.o -o misc.o -Werror -felf64 -Xgnu -gdwarf -DLINUX -D__linux__ misc.asm 00:05:09.256 ld -r -z ibt -z shstk -o misc.o.tmp misc.o 00:05:09.256 mv misc.o.tmp 
misc.o 00:05:09.256 In file included from ipsec_perf.c:59: 00:05:09.256 ipsec_perf.c: In function ‘do_test_gcm’: 00:05:09.256 ../lib/intel-ipsec-mb.h:1382:10: warning: ‘key’ may be used uninitialized [-Wmaybe-uninitialized] 00:05:09.256 1382 | ((_mgr)->gcm128_pre((_key_in), (_key_exp))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 ipsec_perf.c:1937:17: note: in expansion of macro ‘IMB_AES128_GCM_PRE’ 00:05:09.256 1937 | IMB_AES128_GCM_PRE(mb_mgr, key, &gdata_key); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:1382:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, struct gcm_key_data *)’ 00:05:09.256 1382 | ((_mgr)->gcm128_pre((_key_in), (_key_exp))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 ipsec_perf.c:1937:17: note: in expansion of macro ‘IMB_AES128_GCM_PRE’ 00:05:09.256 1937 | IMB_AES128_GCM_PRE(mb_mgr, key, &gdata_key); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:1384:10: warning: ‘key’ may be used uninitialized [-Wmaybe-uninitialized] 00:05:09.256 1384 | ((_mgr)->gcm192_pre((_key_in), (_key_exp))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 ipsec_perf.c:1940:17: note: in expansion of macro ‘IMB_AES192_GCM_PRE’ 00:05:09.256 1940 | IMB_AES192_GCM_PRE(mb_mgr, key, &gdata_key); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:1384:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, struct gcm_key_data *)’ 00:05:09.256 1384 | ((_mgr)->gcm192_pre((_key_in), (_key_exp))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 ipsec_perf.c:1940:17: note: in expansion of macro ‘IMB_AES192_GCM_PRE’ 00:05:09.256 1940 | IMB_AES192_GCM_PRE(mb_mgr, key, &gdata_key); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:1386:10: warning: ‘key’ may be used uninitialized [-Wmaybe-uninitialized] 00:05:09.256 1386 | ((_mgr)->gcm256_pre((_key_in), 
(_key_exp))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 ipsec_perf.c:1944:17: note: in expansion of macro ‘IMB_AES256_GCM_PRE’ 00:05:09.256 1944 | IMB_AES256_GCM_PRE(mb_mgr, key, &gdata_key); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 ../lib/intel-ipsec-mb.h:1386:10: note: by argument 1 of type ‘const void *’ to ‘void(const void *, struct gcm_key_data *)’ 00:05:09.256 1386 | ((_mgr)->gcm256_pre((_key_in), (_key_exp))) 00:05:09.256 | ~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 00:05:09.256 ipsec_perf.c:1944:17: note: in expansion of macro ‘IMB_AES256_GCM_PRE’ 00:05:09.256 1944 | IMB_AES256_GCM_PRE(mb_mgr, key, &gdata_key); 00:05:09.256 | ^~~~~~~~~~~~~~~~~~ 00:05:09.256 gcc -fPIE -z noexecstack -z relro -z now -pthread -fcf-protection=full -Wl,-z,ibt -Wl,-z,shstk -Wl,-z,cet-report=error -L../lib ipsec_perf.o msr.o misc.o -lIPSec_MB -o ipsec_perf 00:05:09.256 make[1]: Leaving directory '/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/perf' 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@119 -- $ DPDK_DRIVERS+=("crypto") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@120 -- $ DPDK_DRIVERS+=("$intel_ipsec_mb_drv") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@121 -- $ DPDK_DRIVERS+=("crypto/qat") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@122 -- $ DPDK_DRIVERS+=("compress/qat") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@123 -- $ DPDK_DRIVERS+=("common/qat") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@125 -- $ ge 22.11.4 21.11.0 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '>=' 21.11.0 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@333 -- $ 
IFS=.-: 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=>=' 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 22 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@350 -- $ local d=22 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@352 -- $ echo 22 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=22 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:05:09.256 16:58:54 build_native_dpdk -- scripts/common.sh@364 -- $ return 0 00:05:09.256 16:58:54 build_native_dpdk -- 
common/autobuild_common.sh@128 -- $ DPDK_DRIVERS+=("bus/auxiliary") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@129 -- $ DPDK_DRIVERS+=("common/mlx5") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@130 -- $ DPDK_DRIVERS+=("common/mlx5/linux") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@131 -- $ DPDK_DRIVERS+=("crypto/mlx5") 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@132 -- $ mlx5_libs_added=y 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@134 -- $ dpdk_cflags+=' -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@135 -- $ dpdk_ldflags+=' -L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@136 -- $ export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@136 -- $ LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 1 -eq 1 ]] 00:05:09.256 16:58:54 build_native_dpdk -- common/autobuild_common.sh@140 -- $ isal_dir=/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l 00:05:09.257 16:58:54 build_native_dpdk -- common/autobuild_common.sh@141 -- $ git clone --branch v2.29.0 --depth 1 https://github.com/intel/isa-l.git /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l 
00:05:09.257 Cloning into '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l'... 00:05:09.257 Note: switching to '806b55ee578efd8158962b90121a4568eb1ecb66'. 00:05:09.257 00:05:09.257 You are in 'detached HEAD' state. You can look around, make experimental 00:05:09.257 changes and commit them, and you can discard any commits you make in this 00:05:09.257 state without impacting any branches by switching back to a branch. 00:05:09.257 00:05:09.257 If you want to create a new branch to retain commits you create, you may 00:05:09.257 do so (now or later) by using -c with the switch command. Example: 00:05:09.257 00:05:09.257 git switch -c 00:05:09.257 00:05:09.257 Or undo this operation with: 00:05:09.257 00:05:09.257 git switch - 00:05:09.257 00:05:09.257 Turn off this advice by setting config variable advice.detachedHead to false 00:05:09.257 00:05:09.257 16:58:55 build_native_dpdk -- common/autobuild_common.sh@143 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l 00:05:09.257 16:58:55 build_native_dpdk -- common/autobuild_common.sh@144 -- $ ./autogen.sh 00:05:09.257 libtoolize: putting auxiliary files in AC_CONFIG_AUX_DIR, 'build-aux'. 00:05:09.257 libtoolize: linking file 'build-aux/ltmain.sh' 00:05:09.257 libtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to configure.ac, 00:05:09.257 libtoolize: and rerunning libtoolize and aclocal. 00:05:09.257 libtoolize: Consider adding '-I m4' to ACLOCAL_AMFLAGS in Makefile.am. 00:05:09.257 configure.ac:53: warning: The macro `AC_PROG_CC_STDC' is obsolete. 00:05:09.257 configure.ac:53: You should run autoupdate. 00:05:09.257 ./lib/autoconf/c.m4:1666: AC_PROG_CC_STDC is expanded from... 
00:05:09.257 configure.ac:53: the top level 00:05:09.257 configure.ac:23: installing 'build-aux/compile' 00:05:09.257 configure.ac:25: installing 'build-aux/config.guess' 00:05:09.257 configure.ac:25: installing 'build-aux/config.sub' 00:05:09.257 configure.ac:12: installing 'build-aux/install-sh' 00:05:09.257 configure.ac:12: installing 'build-aux/missing' 00:05:09.257 Makefile.am: installing 'build-aux/depcomp' 00:05:09.257 parallel-tests: installing 'build-aux/test-driver' 00:05:09.257 00:05:09.257 ---------------------------------------------------------------- 00:05:09.257 Initialized build system. For a common configuration please run: 00:05:09.257 ---------------------------------------------------------------- 00:05:09.257 00:05:09.257 ./configure --prefix=/usr --libdir=/usr/lib64 00:05:09.257 00:05:09.257 16:59:01 build_native_dpdk -- common/autobuild_common.sh@145 -- $ ./configure 'CFLAGS=-fPIC -g -O2' --enable-shared=yes --prefix=/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build 00:05:09.257 checking for a BSD-compatible install... /usr/bin/install -c 00:05:09.257 checking whether build environment is sane... yes 00:05:09.257 checking for a race-free mkdir -p... /usr/bin/mkdir -p 00:05:09.257 checking for gawk... gawk 00:05:09.257 checking whether make sets $(MAKE)... yes 00:05:09.257 checking whether make supports nested variables... yes 00:05:09.257 checking how to create a pax tar archive... gnutar 00:05:09.257 checking whether make supports the include directive... yes (GNU style) 00:05:09.257 checking for gcc... gcc 00:05:09.257 checking whether the C compiler works... yes 00:05:09.257 checking for C compiler default output file name... a.out 00:05:09.257 checking for suffix of executables... 00:05:09.257 checking whether we are cross compiling... no 00:05:09.257 checking for suffix of object files... o 00:05:09.257 checking whether the compiler supports GNU C... yes 00:05:09.257 checking whether gcc accepts -g... 
yes 00:05:09.257 checking for gcc option to enable C11 features... none needed 00:05:09.257 checking whether gcc understands -c and -o together... yes 00:05:09.257 checking dependency style of gcc... gcc3 00:05:09.257 checking dependency style of gcc... gcc3 00:05:09.257 checking build system type... x86_64-pc-linux-gnu 00:05:09.257 checking host system type... x86_64-pc-linux-gnu 00:05:09.257 checking for stdio.h... yes 00:05:09.257 checking for stdlib.h... yes 00:05:09.257 checking for string.h... yes 00:05:09.257 checking for inttypes.h... yes 00:05:09.257 checking for stdint.h... yes 00:05:09.257 checking for strings.h... yes 00:05:09.257 checking for sys/stat.h... yes 00:05:09.257 checking for sys/types.h... yes 00:05:09.257 checking for unistd.h... yes 00:05:09.257 checking for wchar.h... yes 00:05:09.257 checking for minix/config.h... no 00:05:09.257 checking whether it is safe to define __EXTENSIONS__... yes 00:05:09.257 checking whether _XOPEN_SOURCE should be defined... no 00:05:09.257 checking whether make supports nested variables... (cached) yes 00:05:09.257 checking how to print strings... printf 00:05:09.257 checking for a sed that does not truncate output... /usr/bin/sed 00:05:09.257 checking for grep that handles long lines and -e... /usr/bin/grep 00:05:09.257 checking for egrep... /usr/bin/grep -E 00:05:09.257 checking for fgrep... /usr/bin/grep -F 00:05:09.257 checking for ld used by gcc... /usr/bin/ld 00:05:09.257 checking if the linker (/usr/bin/ld) is GNU ld... yes 00:05:09.257 checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B 00:05:09.257 checking the name lister (/usr/bin/nm -B) interface... BSD nm 00:05:09.257 checking whether ln -s works... yes 00:05:09.257 checking the maximum length of command line arguments... 1572864 00:05:09.257 checking how to convert x86_64-pc-linux-gnu file names to x86_64-pc-linux-gnu format... 
func_convert_file_noop 00:05:09.257 checking how to convert x86_64-pc-linux-gnu file names to toolchain format... func_convert_file_noop 00:05:09.257 checking for /usr/bin/ld option to reload object files... -r 00:05:09.257 checking for file... file 00:05:09.257 checking for objdump... objdump 00:05:09.257 checking how to recognize dependent libraries... pass_all 00:05:09.257 checking for dlltool... no 00:05:09.257 checking how to associate runtime and link libraries... printf %s\n 00:05:09.257 checking for ar... ar 00:05:09.257 checking for archiver @FILE support... @ 00:05:09.257 checking for strip... strip 00:05:09.257 checking for ranlib... ranlib 00:05:09.257 checking command to parse /usr/bin/nm -B output from gcc object... ok 00:05:09.257 checking for sysroot... no 00:05:09.257 checking for a working dd... /usr/bin/dd 00:05:09.257 checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1 00:05:09.257 checking for mt... no 00:05:09.257 checking if : is a manifest tool... no 00:05:09.257 checking for dlfcn.h... yes 00:05:09.257 checking for objdir... .libs 00:05:09.517 checking if gcc supports -fno-rtti -fno-exceptions... no 00:05:09.517 checking for gcc option to produce PIC... -fPIC -DPIC 00:05:09.517 checking if gcc PIC flag -fPIC -DPIC works... yes 00:05:09.776 checking if gcc static flag -static works... yes 00:05:09.776 checking if gcc supports -c -o file.o... yes 00:05:09.776 checking if gcc supports -c -o file.o... (cached) yes 00:05:09.776 checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes 00:05:10.035 checking whether -lc should be explicitly linked in... no 00:05:10.035 checking dynamic linker characteristics... GNU/Linux ld.so 00:05:10.035 checking how to hardcode library paths into programs... immediate 00:05:10.035 checking whether stripping libraries is possible... yes 00:05:10.035 checking if libtool supports shared libraries... yes 00:05:10.035 checking whether to build shared libraries... 
yes 00:05:10.035 checking whether to build static libraries... yes 00:05:10.035 checking for a sed that does not truncate output... (cached) /usr/bin/sed 00:05:10.035 checking for yasm... yes 00:05:10.035 checking for modern yasm... yes 00:05:10.035 checking for optional yasm AVX512 support... no 00:05:10.035 checking for nasm... yes 00:05:10.035 checking for modern nasm... yes 00:05:10.035 checking for optional nasm AVX512 support... yes 00:05:10.035 checking for additional nasm AVX512 support... yes 00:05:10.035 Using nasm args target "linux" "-f elf64" 00:05:10.294 checking for limits.h... yes 00:05:10.294 checking for stdint.h... (cached) yes 00:05:10.294 checking for stdlib.h... (cached) yes 00:05:10.294 checking for string.h... (cached) yes 00:05:10.294 checking for inline... inline 00:05:10.294 checking for size_t... yes 00:05:10.553 checking for uint16_t... yes 00:05:10.553 checking for uint32_t... yes 00:05:10.553 checking for uint64_t... yes 00:05:10.811 checking for uint8_t... yes 00:05:10.811 checking for GNU libc compatible malloc... yes 00:05:10.811 checking for memmove... yes 00:05:11.070 checking for memset... yes 00:05:11.070 checking for getopt... yes 00:05:11.329 checking that generated files are newer than configure... 
done 00:05:11.329 configure: creating ./config.status 00:05:12.710 config.status: creating Makefile 00:05:12.710 config.status: creating libisal.pc 00:05:12.710 config.status: executing depfiles commands 00:05:14.089 config.status: executing libtool commands 00:05:14.348 00:05:14.348 isa-l 2.29.0 00:05:14.348 ===== 00:05:14.348 00:05:14.348 prefix: /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build 00:05:14.348 sysconfdir: ${prefix}/etc 00:05:14.348 libdir: ${exec_prefix}/lib 00:05:14.348 includedir: ${prefix}/include 00:05:14.348 00:05:14.348 compiler: gcc 00:05:14.348 cflags: -fPIC -g -O2 00:05:14.348 ldflags: 00:05:14.348 00:05:14.348 debug: no 00:05:14.348 00:05:14.348 16:59:09 build_native_dpdk -- common/autobuild_common.sh@146 -- $ ln -s /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/include /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/isa-l 00:05:14.348 16:59:09 build_native_dpdk -- common/autobuild_common.sh@147 -- $ make -j72 all 00:05:14.348 Building isa-l.h 00:05:14.608 make --no-print-directory all-am 00:05:14.608 CC erasure_code/ec_highlevel_func.lo 00:05:14.608 MKTMP erasure_code/gf_vect_mul_sse.s 00:05:14.608 MKTMP erasure_code/gf_vect_mul_avx.s 00:05:14.608 MKTMP erasure_code/gf_vect_dot_prod_sse.s 00:05:14.608 MKTMP erasure_code/gf_vect_dot_prod_avx.s 00:05:14.608 MKTMP erasure_code/gf_vect_dot_prod_avx2.s 00:05:14.608 MKTMP erasure_code/gf_2vect_dot_prod_sse.s 00:05:14.608 MKTMP erasure_code/gf_3vect_dot_prod_sse.s 00:05:14.608 MKTMP erasure_code/gf_4vect_dot_prod_sse.s 00:05:14.608 MKTMP erasure_code/gf_5vect_dot_prod_sse.s 00:05:14.608 MKTMP erasure_code/gf_6vect_dot_prod_sse.s 00:05:14.608 MKTMP erasure_code/gf_2vect_dot_prod_avx.s 00:05:14.608 MKTMP erasure_code/gf_3vect_dot_prod_avx.s 00:05:14.608 MKTMP erasure_code/gf_5vect_dot_prod_avx.s 00:05:14.608 MKTMP erasure_code/gf_4vect_dot_prod_avx.s 00:05:14.608 MKTMP erasure_code/gf_6vect_dot_prod_avx.s 00:05:14.608 MKTMP erasure_code/gf_2vect_dot_prod_avx2.s 00:05:14.608 
MKTMP erasure_code/gf_3vect_dot_prod_avx2.s 00:05:14.608 MKTMP erasure_code/gf_4vect_dot_prod_avx2.s 00:05:14.608 MKTMP erasure_code/gf_5vect_dot_prod_avx2.s 00:05:14.608 MKTMP erasure_code/gf_6vect_dot_prod_avx2.s 00:05:14.608 MKTMP erasure_code/gf_vect_mad_sse.s 00:05:14.608 MKTMP erasure_code/gf_2vect_mad_sse.s 00:05:14.608 MKTMP erasure_code/gf_3vect_mad_sse.s 00:05:14.608 MKTMP erasure_code/gf_4vect_mad_sse.s 00:05:14.608 MKTMP erasure_code/gf_5vect_mad_sse.s 00:05:14.608 MKTMP erasure_code/gf_6vect_mad_sse.s 00:05:14.608 MKTMP erasure_code/gf_vect_mad_avx.s 00:05:14.608 MKTMP erasure_code/gf_2vect_mad_avx.s 00:05:14.608 MKTMP erasure_code/gf_3vect_mad_avx.s 00:05:14.608 MKTMP erasure_code/gf_4vect_mad_avx.s 00:05:14.608 MKTMP erasure_code/gf_5vect_mad_avx.s 00:05:14.608 MKTMP erasure_code/gf_6vect_mad_avx.s 00:05:14.608 MKTMP erasure_code/gf_vect_mad_avx2.s 00:05:14.608 MKTMP erasure_code/gf_2vect_mad_avx2.s 00:05:14.608 MKTMP erasure_code/gf_3vect_mad_avx2.s 00:05:14.608 MKTMP erasure_code/gf_4vect_mad_avx2.s 00:05:14.608 MKTMP erasure_code/gf_5vect_mad_avx2.s 00:05:14.608 MKTMP erasure_code/gf_6vect_mad_avx2.s 00:05:14.608 MKTMP erasure_code/ec_multibinary.s 00:05:14.608 MKTMP erasure_code/gf_vect_dot_prod_avx512.s 00:05:14.608 MKTMP erasure_code/gf_2vect_dot_prod_avx512.s 00:05:14.608 MKTMP erasure_code/gf_3vect_dot_prod_avx512.s 00:05:14.608 MKTMP erasure_code/gf_4vect_dot_prod_avx512.s 00:05:14.608 MKTMP erasure_code/gf_5vect_dot_prod_avx512.s 00:05:14.608 MKTMP erasure_code/gf_6vect_dot_prod_avx512.s 00:05:14.608 MKTMP erasure_code/gf_vect_mad_avx512.s 00:05:14.608 MKTMP erasure_code/gf_2vect_mad_avx512.s 00:05:14.608 MKTMP erasure_code/gf_4vect_mad_avx512.s 00:05:14.608 MKTMP erasure_code/gf_3vect_mad_avx512.s 00:05:14.608 MKTMP erasure_code/gf_5vect_mad_avx512.s 00:05:14.608 MKTMP erasure_code/gf_6vect_mad_avx512.s 00:05:14.608 MKTMP raid/xor_gen_sse.s 00:05:14.608 MKTMP raid/pq_gen_sse.s 00:05:14.608 MKTMP raid/xor_check_sse.s 00:05:14.608 MKTMP 
raid/pq_check_sse.s 00:05:14.608 MKTMP raid/pq_gen_avx.s 00:05:14.608 MKTMP raid/xor_gen_avx.s 00:05:14.608 MKTMP raid/pq_gen_avx2.s 00:05:14.608 MKTMP raid/xor_gen_avx512.s 00:05:14.608 MKTMP raid/pq_gen_avx512.s 00:05:14.608 MKTMP raid/raid_multibinary.s 00:05:14.608 MKTMP crc/crc16_t10dif_01.s 00:05:14.608 MKTMP crc/crc16_t10dif_by4.s 00:05:14.608 MKTMP crc/crc16_t10dif_02.s 00:05:14.608 MKTMP crc/crc16_t10dif_by16_10.s 00:05:14.608 MKTMP crc/crc16_t10dif_copy_by4.s 00:05:14.608 MKTMP crc/crc16_t10dif_copy_by4_02.s 00:05:14.608 MKTMP crc/crc32_ieee_01.s 00:05:14.608 MKTMP crc/crc32_ieee_02.s 00:05:14.608 MKTMP crc/crc32_ieee_by4.s 00:05:14.608 MKTMP crc/crc32_ieee_by16_10.s 00:05:14.608 MKTMP crc/crc32_iscsi_01.s 00:05:14.608 MKTMP crc/crc32_iscsi_00.s 00:05:14.608 MKTMP crc/crc_multibinary.s 00:05:14.608 MKTMP crc/crc64_multibinary.s 00:05:14.608 MKTMP crc/crc64_ecma_refl_by8.s 00:05:14.608 MKTMP crc/crc64_ecma_norm_by8.s 00:05:14.608 MKTMP crc/crc64_ecma_refl_by16_10.s 00:05:14.608 MKTMP crc/crc64_ecma_norm_by16_10.s 00:05:14.608 MKTMP crc/crc64_iso_refl_by8.s 00:05:14.608 MKTMP crc/crc64_iso_refl_by16_10.s 00:05:14.608 MKTMP crc/crc64_iso_norm_by8.s 00:05:14.608 MKTMP crc/crc64_iso_norm_by16_10.s 00:05:14.608 MKTMP crc/crc64_jones_refl_by8.s 00:05:14.608 MKTMP crc/crc64_jones_refl_by16_10.s 00:05:14.608 MKTMP crc/crc64_jones_norm_by8.s 00:05:14.608 MKTMP crc/crc64_jones_norm_by16_10.s 00:05:14.608 MKTMP crc/crc32_gzip_refl_by8.s 00:05:14.608 MKTMP crc/crc32_gzip_refl_by8_02.s 00:05:14.608 MKTMP crc/crc32_gzip_refl_by16_10.s 00:05:14.608 MKTMP igzip/igzip_body.s 00:05:14.609 MKTMP igzip/igzip_finish.s 00:05:14.609 MKTMP igzip/igzip_icf_body_h1_gr_bt.s 00:05:14.609 MKTMP igzip/igzip_icf_finish.s 00:05:14.609 MKTMP igzip/rfc1951_lookup.s 00:05:14.868 MKTMP igzip/adler32_sse.s 00:05:14.868 MKTMP igzip/adler32_avx2_4.s 00:05:14.868 MKTMP igzip/igzip_multibinary.s 00:05:14.868 MKTMP igzip/igzip_update_histogram_01.s 00:05:14.868 MKTMP 
igzip/igzip_update_histogram_04.s 00:05:14.868 MKTMP igzip/igzip_decode_block_stateless_01.s 00:05:14.868 MKTMP igzip/igzip_decode_block_stateless_04.s 00:05:14.868 MKTMP igzip/igzip_inflate_multibinary.s 00:05:14.868 MKTMP igzip/encode_df_04.s 00:05:14.868 MKTMP igzip/encode_df_06.s 00:05:14.868 MKTMP igzip/proc_heap.s 00:05:14.868 MKTMP igzip/igzip_deflate_hash.s 00:05:14.868 MKTMP igzip/igzip_gen_icf_map_lh1_06.s 00:05:14.868 MKTMP igzip/igzip_gen_icf_map_lh1_04.s 00:05:14.868 MKTMP igzip/igzip_set_long_icf_fg_04.s 00:05:14.868 MKTMP igzip/igzip_set_long_icf_fg_06.s 00:05:14.868 MKTMP mem/mem_zero_detect_avx.s 00:05:14.868 MKTMP mem/mem_zero_detect_sse.s 00:05:14.868 MKTMP mem/mem_multibinary.s 00:05:14.868 CC programs/igzip_cli.o 00:05:14.868 CC erasure_code/ec_base.lo 00:05:14.868 CC raid/raid_base.lo 00:05:14.868 CC crc/crc_base.lo 00:05:14.868 CC crc/crc64_base.lo 00:05:14.868 CC igzip/igzip.lo 00:05:14.868 CC igzip/hufftables_c.lo 00:05:14.868 CC igzip/igzip_base.lo 00:05:14.868 CC igzip/igzip_icf_base.lo 00:05:14.868 CC igzip/adler32_base.lo 00:05:14.868 CC igzip/encode_df.lo 00:05:14.868 CC igzip/flatten_ll.lo 00:05:14.868 CC igzip/igzip_icf_body.lo 00:05:14.868 CC igzip/huff_codes.lo 00:05:14.868 CC igzip/igzip_inflate.lo 00:05:14.868 CC mem/mem_zero_detect_base.lo 00:05:14.868 CCAS erasure_code/gf_vect_mul_sse.lo 00:05:14.868 CCAS erasure_code/gf_vect_mul_avx.lo 00:05:14.868 CCAS erasure_code/gf_vect_dot_prod_sse.lo 00:05:14.868 CCAS erasure_code/gf_vect_dot_prod_avx.lo 00:05:14.868 CCAS erasure_code/gf_vect_dot_prod_avx2.lo 00:05:14.868 CCAS erasure_code/gf_2vect_dot_prod_sse.lo 00:05:14.868 CCAS erasure_code/gf_4vect_dot_prod_sse.lo 00:05:14.868 CCAS erasure_code/gf_3vect_dot_prod_sse.lo 00:05:14.868 CCAS erasure_code/gf_5vect_dot_prod_sse.lo 00:05:14.868 CCAS erasure_code/gf_6vect_dot_prod_sse.lo 00:05:14.868 CCAS erasure_code/gf_2vect_dot_prod_avx.lo 00:05:14.868 CCAS erasure_code/gf_3vect_dot_prod_avx.lo 00:05:14.868 CCAS 
erasure_code/gf_4vect_dot_prod_avx.lo 00:05:14.868 CCAS erasure_code/gf_5vect_dot_prod_avx.lo 00:05:14.868 CCAS erasure_code/gf_6vect_dot_prod_avx.lo 00:05:14.868 CCAS erasure_code/gf_2vect_dot_prod_avx2.lo 00:05:14.868 CCAS erasure_code/gf_3vect_dot_prod_avx2.lo 00:05:14.868 CCAS erasure_code/gf_5vect_dot_prod_avx2.lo 00:05:14.868 CCAS erasure_code/gf_4vect_dot_prod_avx2.lo 00:05:14.868 CCAS erasure_code/gf_6vect_dot_prod_avx2.lo 00:05:14.868 CCAS erasure_code/gf_3vect_mad_sse.lo 00:05:14.868 CCAS erasure_code/gf_vect_mad_sse.lo 00:05:14.868 CCAS erasure_code/gf_2vect_mad_sse.lo 00:05:14.868 CCAS erasure_code/gf_4vect_mad_sse.lo 00:05:14.868 CCAS erasure_code/gf_6vect_mad_sse.lo 00:05:14.868 CCAS erasure_code/gf_5vect_mad_sse.lo 00:05:14.868 CCAS erasure_code/gf_vect_mad_avx.lo 00:05:14.868 CCAS erasure_code/gf_3vect_mad_avx.lo 00:05:14.868 CCAS erasure_code/gf_4vect_mad_avx.lo 00:05:14.868 CCAS erasure_code/gf_2vect_mad_avx.lo 00:05:14.868 CCAS erasure_code/gf_5vect_mad_avx.lo 00:05:14.868 CCAS erasure_code/gf_6vect_mad_avx.lo 00:05:14.868 CCAS erasure_code/gf_2vect_mad_avx2.lo 00:05:14.868 CCAS erasure_code/gf_vect_mad_avx2.lo 00:05:14.868 CCAS erasure_code/gf_3vect_mad_avx2.lo 00:05:14.868 CCAS erasure_code/gf_4vect_mad_avx2.lo 00:05:14.868 CCAS erasure_code/gf_5vect_mad_avx2.lo 00:05:14.868 CCAS erasure_code/gf_6vect_mad_avx2.lo 00:05:14.868 CCAS erasure_code/ec_multibinary.lo 00:05:14.868 CCAS erasure_code/gf_vect_dot_prod_avx512.lo 00:05:14.868 CCAS erasure_code/gf_2vect_dot_prod_avx512.lo 00:05:14.868 CCAS erasure_code/gf_3vect_dot_prod_avx512.lo 00:05:14.868 CCAS erasure_code/gf_4vect_dot_prod_avx512.lo 00:05:14.868 CCAS erasure_code/gf_5vect_dot_prod_avx512.lo 00:05:14.868 CCAS erasure_code/gf_6vect_dot_prod_avx512.lo 00:05:14.868 CCAS erasure_code/gf_vect_mad_avx512.lo 00:05:14.868 CCAS erasure_code/gf_2vect_mad_avx512.lo 00:05:14.868 CCAS erasure_code/gf_3vect_mad_avx512.lo 00:05:14.868 CCAS erasure_code/gf_4vect_mad_avx512.lo 00:05:14.868 CCAS 
erasure_code/gf_5vect_mad_avx512.lo 00:05:14.868 CCAS erasure_code/gf_6vect_mad_avx512.lo 00:05:14.868 CCAS raid/xor_gen_sse.lo 00:05:14.868 CCAS raid/pq_gen_sse.lo 00:05:14.868 CCAS raid/xor_check_sse.lo 00:05:14.868 CCAS raid/pq_check_sse.lo 00:05:14.868 CCAS raid/pq_gen_avx.lo 00:05:14.868 CCAS raid/xor_gen_avx.lo 00:05:14.868 CCAS raid/pq_gen_avx2.lo 00:05:14.868 CCAS raid/xor_gen_avx512.lo 00:05:14.868 CCAS raid/pq_gen_avx512.lo 00:05:14.868 CCAS raid/raid_multibinary.lo 00:05:14.868 CCAS crc/crc16_t10dif_by4.lo 00:05:14.868 CCAS crc/crc16_t10dif_01.lo 00:05:14.868 CCAS crc/crc16_t10dif_02.lo 00:05:14.868 CCAS crc/crc16_t10dif_by16_10.lo 00:05:14.868 CCAS crc/crc16_t10dif_copy_by4.lo 00:05:14.868 CCAS crc/crc16_t10dif_copy_by4_02.lo 00:05:14.868 CCAS crc/crc32_ieee_01.lo 00:05:14.868 CCAS crc/crc32_ieee_02.lo 00:05:14.868 CCAS crc/crc32_ieee_by4.lo 00:05:14.868 CCAS crc/crc32_ieee_by16_10.lo 00:05:14.868 CCAS crc/crc32_iscsi_01.lo 00:05:14.868 CCAS crc/crc32_iscsi_00.lo 00:05:14.868 CCAS crc/crc_multibinary.lo 00:05:14.868 CCAS crc/crc64_multibinary.lo 00:05:14.868 CCAS crc/crc64_ecma_refl_by8.lo 00:05:14.868 CCAS crc/crc64_ecma_refl_by16_10.lo 00:05:14.868 CCAS crc/crc64_ecma_norm_by8.lo 00:05:15.127 CCAS crc/crc64_iso_refl_by8.lo 00:05:15.127 CCAS crc/crc64_iso_refl_by16_10.lo 00:05:15.127 CCAS crc/crc64_iso_norm_by8.lo 00:05:15.127 CCAS crc/crc64_ecma_norm_by16_10.lo 00:05:15.127 CCAS crc/crc64_iso_norm_by16_10.lo 00:05:15.127 CCAS crc/crc64_jones_refl_by8.lo 00:05:15.127 CCAS crc/crc64_jones_refl_by16_10.lo 00:05:15.127 CCAS crc/crc64_jones_norm_by8.lo 00:05:15.127 CCAS crc/crc32_gzip_refl_by8.lo 00:05:15.127 CCAS crc/crc64_jones_norm_by16_10.lo 00:05:15.127 CCAS crc/crc32_gzip_refl_by8_02.lo 00:05:15.127 CCAS crc/crc32_gzip_refl_by16_10.lo 00:05:15.127 CCAS igzip/igzip_body.lo 00:05:15.127 CCAS igzip/igzip_finish.lo 00:05:15.127 CCAS igzip/igzip_icf_body_h1_gr_bt.lo 00:05:15.127 CCAS igzip/igzip_icf_finish.lo 00:05:15.127 CCAS igzip/rfc1951_lookup.lo 
00:05:15.127 CCAS igzip/adler32_sse.lo 00:05:15.127 CCAS igzip/adler32_avx2_4.lo 00:05:15.127 CCAS igzip/igzip_multibinary.lo 00:05:15.127 CCAS igzip/igzip_update_histogram_01.lo 00:05:15.127 CCAS igzip/igzip_update_histogram_04.lo 00:05:15.127 CCAS igzip/igzip_decode_block_stateless_01.lo 00:05:15.127 CCAS igzip/igzip_decode_block_stateless_04.lo 00:05:15.127 CCAS igzip/igzip_inflate_multibinary.lo 00:05:15.127 CCAS igzip/encode_df_04.lo 00:05:15.127 CCAS igzip/encode_df_06.lo 00:05:15.127 CCAS igzip/proc_heap.lo 00:05:15.127 CCAS igzip/igzip_deflate_hash.lo 00:05:15.127 CCAS igzip/igzip_gen_icf_map_lh1_06.lo 00:05:15.127 CCAS igzip/igzip_gen_icf_map_lh1_04.lo 00:05:15.127 CCAS igzip/igzip_set_long_icf_fg_04.lo 00:05:15.127 CCAS igzip/igzip_set_long_icf_fg_06.lo 00:05:15.127 CCAS mem/mem_zero_detect_avx.lo 00:05:15.127 CCAS mem/mem_zero_detect_sse.lo 00:05:15.127 CCAS mem/mem_multibinary.lo 00:05:21.696 CCLD libisal.la 00:05:21.955 CCLD programs/igzip 00:05:22.214 rm erasure_code/gf_5vect_dot_prod_avx512.s erasure_code/gf_3vect_mad_avx.s erasure_code/gf_5vect_dot_prod_avx2.s erasure_code/gf_6vect_dot_prod_avx.s crc/crc16_t10dif_01.s crc/crc32_iscsi_00.s erasure_code/gf_5vect_dot_prod_avx.s igzip/encode_df_04.s erasure_code/gf_6vect_mad_sse.s erasure_code/gf_4vect_dot_prod_sse.s erasure_code/gf_5vect_mad_avx512.s crc/crc16_t10dif_copy_by4.s erasure_code/gf_5vect_mad_avx2.s erasure_code/gf_vect_mad_avx2.s igzip/proc_heap.s erasure_code/gf_3vect_dot_prod_sse.s igzip/igzip_set_long_icf_fg_06.s crc/crc64_jones_refl_by8.s erasure_code/gf_vect_dot_prod_avx2.s igzip/encode_df_06.s crc/crc_multibinary.s erasure_code/gf_4vect_mad_avx512.s erasure_code/gf_2vect_mad_avx2.s erasure_code/gf_4vect_mad_avx.s igzip/igzip_set_long_icf_fg_04.s crc/crc64_iso_refl_by8.s crc/crc16_t10dif_by16_10.s erasure_code/gf_2vect_dot_prod_avx2.s igzip/igzip_gen_icf_map_lh1_04.s raid/xor_check_sse.s erasure_code/gf_5vect_mad_avx.s raid/pq_gen_sse.s erasure_code/gf_vect_mad_avx.s 
erasure_code/gf_5vect_dot_prod_sse.s erasure_code/ec_multibinary.s crc/crc64_iso_norm_by16_10.s igzip/rfc1951_lookup.s raid/pq_gen_avx2.s erasure_code/gf_6vect_mad_avx.s crc/crc32_gzip_refl_by8.s igzip/igzip_gen_icf_map_lh1_06.s erasure_code/gf_3vect_dot_prod_avx2.s erasure_code/gf_2vect_mad_avx512.s igzip/igzip_update_histogram_04.s crc/crc64_ecma_norm_by16_10.s crc/crc32_ieee_by4.s erasure_code/gf_4vect_dot_prod_avx.s crc/crc16_t10dif_02.s erasure_code/gf_2vect_mad_sse.s raid/xor_gen_sse.s erasure_code/gf_5vect_mad_sse.s erasure_code/gf_3vect_dot_prod_avx512.s erasure_code/gf_3vect_mad_avx512.s raid/pq_gen_avx.s erasure_code/gf_2vect_dot_prod_sse.s igzip/igzip_multibinary.s igzip/igzip_deflate_hash.s erasure_code/gf_vect_mad_avx512.s raid/pq_gen_avx512.s igzip/adler32_sse.s crc/crc32_iscsi_01.s crc/crc16_t10dif_by4.s erasure_code/gf_6vect_dot_prod_avx2.s crc/crc32_gzip_refl_by16_10.s raid/xor_gen_avx512.s erasure_code/gf_vect_dot_prod_avx.s igzip/igzip_icf_finish.s erasure_code/gf_vect_mad_sse.s erasure_code/gf_vect_mul_sse.s erasure_code/gf_6vect_mad_avx512.s igzip/igzip_decode_block_stateless_04.s erasure_code/gf_6vect_mad_avx2.s crc/crc64_ecma_refl_by16_10.s raid/xor_gen_avx.s erasure_code/gf_6vect_dot_prod_avx512.s erasure_code/gf_2vect_mad_avx.s erasure_code/gf_2vect_dot_prod_avx512.s crc/crc32_ieee_by16_10.s crc/crc64_iso_refl_by16_10.s erasure_code/gf_3vect_mad_sse.s raid/pq_check_sse.s erasure_code/gf_2vect_dot_prod_avx.s mem/mem_zero_detect_avx.s crc/crc32_ieee_01.s crc/crc64_jones_refl_by16_10.s crc/crc64_multibinary.s mem/mem_multibinary.s raid/raid_multibinary.s erasure_code/gf_3vect_dot_prod_avx.s crc/crc32_ieee_02.s mem/mem_zero_detect_sse.s igzip/igzip_decode_block_stateless_01.s erasure_code/gf_4vect_dot_prod_avx2.s crc/crc32_gzip_refl_by8_02.s igzip/igzip_finish.s erasure_code/gf_4vect_mad_avx2.s crc/crc16_t10dif_copy_by4_02.s erasure_code/gf_vect_dot_prod_sse.s erasure_code/gf_3vect_mad_avx2.s erasure_code/gf_vect_mul_avx.s 
igzip/adler32_avx2_4.s erasure_code/gf_4vect_mad_sse.s igzip/igzip_inflate_multibinary.s crc/crc64_ecma_norm_by8.s igzip/igzip_body.s erasure_code/gf_6vect_dot_prod_sse.s crc/crc64_jones_norm_by16_10.s crc/crc64_iso_norm_by8.s crc/crc64_jones_norm_by8.s erasure_code/gf_4vect_dot_prod_avx512.s crc/crc64_ecma_refl_by8.s igzip/igzip_update_histogram_01.s igzip/igzip_icf_body_h1_gr_bt.s erasure_code/gf_vect_dot_prod_avx512.s 00:05:22.214 16:59:17 build_native_dpdk -- common/autobuild_common.sh@148 -- $ make install 00:05:22.214 make --no-print-directory install-am 00:05:22.473 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib' 00:05:22.473 /bin/sh ./libtool --mode=install /usr/bin/install -c libisal.la '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib' 00:05:22.473 libtool: install: /usr/bin/install -c .libs/libisal.so.2.0.29 /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.so.2.0.29 00:05:22.473 libtool: install: (cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib && { ln -s -f libisal.so.2.0.29 libisal.so.2 || { rm -f libisal.so.2 && ln -s libisal.so.2.0.29 libisal.so.2; }; }) 00:05:22.473 libtool: install: (cd /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib && { ln -s -f libisal.so.2.0.29 libisal.so || { rm -f libisal.so && ln -s libisal.so.2.0.29 libisal.so; }; }) 00:05:22.473 libtool: install: /usr/bin/install -c .libs/libisal.lai /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.la 00:05:22.473 libtool: install: /usr/bin/install -c .libs/libisal.a /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.a 00:05:22.473 libtool: install: chmod 644 /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.a 00:05:22.473 libtool: install: ranlib /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/libisal.a 00:05:22.732 libtool: finish: 
PATH="/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin:/sbin" ldconfig -n /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:05:22.733 ---------------------------------------------------------------------- 00:05:22.733 Libraries have been installed in: 00:05:22.733 /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:05:22.733 00:05:22.733 If you ever happen to want to link against installed libraries 00:05:22.733 in a given directory, LIBDIR, you must either use libtool, and 00:05:22.733 specify the full pathname of the library, or use the '-LLIBDIR' 00:05:22.733 flag during linking and do at least one of the following: 00:05:22.733 - add LIBDIR to the 'LD_LIBRARY_PATH' environment variable 00:05:22.733 during execution 00:05:22.733 - add LIBDIR to the 'LD_RUN_PATH' environment variable 00:05:22.733 during linking 00:05:22.733 - use the '-Wl,-rpath -Wl,LIBDIR' linker flag 00:05:22.733 - have your system administrator add LIBDIR to '/etc/ld.so.conf' 00:05:22.733 00:05:22.733 See any operating system documentation about shared libraries for 00:05:22.733 more information, such as the ld(1) and ld.so(8) manual pages. 
00:05:22.733 ---------------------------------------------------------------------- 00:05:22.733 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/bin' 00:05:22.733 /bin/sh ./libtool --mode=install /usr/bin/install -c programs/igzip '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/bin' 00:05:22.733 libtool: install: /usr/bin/install -c programs/.libs/igzip /var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/bin/igzip 00:05:22.993 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/share/man/man1' 00:05:22.993 /usr/bin/install -c -m 644 programs/igzip.1 '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/share/man/man1' 00:05:22.993 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include' 00:05:22.993 /usr/bin/install -c -m 644 isa-l.h '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include/.' 00:05:22.993 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig' 00:05:22.993 /usr/bin/install -c -m 644 libisal.pc '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig' 00:05:22.993 /usr/bin/mkdir -p '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include/isa-l' 00:05:22.993 /usr/bin/install -c -m 644 include/test.h include/types.h include/crc.h include/crc64.h include/erasure_code.h include/gf_vect_mul.h include/igzip_lib.h include/mem_routines.h include/raid.h '/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/include/isa-l' 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@149 -- $ DPDK_DRIVERS+=("compress") 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@150 -- $ DPDK_DRIVERS+=("compress/isal") 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@151 -- $ DPDK_DRIVERS+=("compress/qat") 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@152 -- $ DPDK_DRIVERS+=("common/qat") 
00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@153 -- $ ge 22.11.4 21.02.0 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '>=' 21.02.0 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=>=' 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 22 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@350 -- $ local d=22 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@352 -- $ echo 22 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=22 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:05:22.993 16:59:18 build_native_dpdk -- scripts/common.sh@364 -- $ return 0 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@156 -- $ test y = n 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@161 -- $ DPDK_DRIVERS+=("compress/mlx5") 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@163 -- $ export PKG_CONFIG_PATH=:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig 00:05:22.993 16:59:18 build_native_dpdk -- common/autobuild_common.sh@163 -- $ PKG_CONFIG_PATH=:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib/pkgconfig 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@164 -- $ export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 
00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@164 -- $ LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/dpdk 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@370 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=<' 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@342 -- $ : 1 00:05:22.994 16:59:18 build_native_dpdk -- 
scripts/common.sh@361 -- $ (( v = 0 )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@350 -- $ local d=22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@352 -- $ echo 22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 21 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@350 -- $ local d=21 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@352 -- $ echo 21 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=21 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@364 -- $ return 1 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:05:22.994 patching file config/rte_config.h 00:05:22.994 Hunk #1 succeeded at 60 (offset 1 line). 
00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@370 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@330 -- $ local ver1 ver1_l 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@331 -- $ local ver2 ver2_l 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@333 -- $ IFS=.-: 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@333 -- $ read -ra ver1 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@334 -- $ IFS=.-: 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@334 -- $ read -ra ver2 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@335 -- $ local 'op=<' 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@337 -- $ ver1_l=3 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@338 -- $ ver2_l=3 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@341 -- $ case "$op" in 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@342 -- $ : 1 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@361 -- $ (( v = 0 )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@362 -- $ decimal 22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@350 -- $ local d=22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@352 -- $ echo 22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@362 -- $ ver1[v]=22 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@363 -- $ decimal 24 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@350 -- $ local d=24 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@351 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@352 -- $ echo 24 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@363 -- $ ver2[v]=24 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@365 -- $ (( ver1[v] < ver2[v] )) 00:05:22.994 16:59:18 build_native_dpdk -- scripts/common.sh@365 -- $ return 0 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:05:22.994 patching file lib/pcapng/rte_pcapng.c 00:05:22.994 Hunk #1 succeeded at 110 (offset -18 lines). 
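The `cmp_versions` traces above split each version string on `IFS=.-:` and compare the numeric components left to right (22 vs 24 decides `22.11.4 < 24.07.0` at the first component). A minimal bash sketch of that comparison, with hypothetical names (`ver_lt` is not the scripts' real function; the actual helpers live in `scripts/common.sh`):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the component-wise version check traced in the
# log: split on ".-:", then compare numerically, padding missing
# components with 0. Returns 0 (true) iff $1 < $2.
ver_lt() {
  local IFS=.-:          # split version strings on '.', '-', and ':'
  local -a a b
  read -ra a <<< "$1"
  read -ra b <<< "$2"
  local i n
  n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
  for (( i = 0; i < n; i++ )); do
    if (( ${a[i]:-0} < ${b[i]:-0} )); then return 0; fi
    if (( ${a[i]:-0} > ${b[i]:-0} )); then return 1; fi
  done
  return 1               # all components equal: not strictly less-than
}

ver_lt 22.11.4 24.07.0 && echo "22.11.4 < 24.07.0"
ver_lt 22.11.4 21.11.0 || echo "22.11.4 >= 21.11.0"
```

This reproduces the two decisions visible in the trace: `lt 22.11.4 21.11.0` fails at the first component (22 > 21), while `lt 22.11.4 24.07.0` succeeds (22 < 24), which is why only the second `patch -p1` path above is gated off the earlier check.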
00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@181 -- $ uname -s 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']' 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base crypto crypto/ipsec_mb crypto/qat compress/qat common/qat bus/auxiliary common/mlx5 common/mlx5/linux crypto/mlx5 compress compress/isal compress/qat common/qat compress/mlx5 00:05:22.994 16:59:18 build_native_dpdk -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false '-Dc_link_args= -L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,crypto,crypto/ipsec_mb,crypto/qat,compress/qat,common/qat,bus/auxiliary,common/mlx5,common/mlx5/linux,crypto/mlx5,compress,compress/isal,compress/qat,common/qat,compress/mlx5, 00:05:29.565 The Meson build system 00:05:29.565 Version: 1.3.1 00:05:29.565 Source dir: /var/jenkins/workspace/crypto-phy-autotest/dpdk 00:05:29.565 Build dir: /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp 00:05:29.565 Build type: native build 00:05:29.565 Program cat found: YES (/usr/bin/cat) 00:05:29.565 Project name: DPDK 00:05:29.565 Project version: 22.11.4 00:05:29.565 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:05:29.565 C linker for the host machine: gcc ld.bfd 2.39-16 00:05:29.565 Host machine cpu family: x86_64 00:05:29.565 Host machine cpu: x86_64 00:05:29.565 Message: ## Building in Developer 
Mode ## 00:05:29.565 Program pkg-config found: YES (/usr/bin/pkg-config) 00:05:29.565 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:05:29.565 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:05:29.565 Program objdump found: YES (/usr/bin/objdump) 00:05:29.565 Program python3 found: YES (/usr/bin/python3) 00:05:29.565 Program cat found: YES (/usr/bin/cat) 00:05:29.565 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 00:05:29.565 Checking for size of "void *" : 8 00:05:29.565 Checking for size of "void *" : 8 (cached) 00:05:29.565 Library m found: YES 00:05:29.565 Library numa found: YES 00:05:29.565 Has header "numaif.h" : YES 00:05:29.565 Library fdt found: NO 00:05:29.565 Library execinfo found: NO 00:05:29.565 Has header "execinfo.h" : YES 00:05:29.565 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:05:29.565 Run-time dependency libarchive found: NO (tried pkgconfig) 00:05:29.565 Run-time dependency libbsd found: NO (tried pkgconfig) 00:05:29.565 Run-time dependency jansson found: NO (tried pkgconfig) 00:05:29.565 Run-time dependency openssl found: YES 3.0.9 00:05:29.565 Run-time dependency libpcap found: YES 1.10.4 00:05:29.565 Has header "pcap.h" with dependency libpcap: YES 00:05:29.565 Compiler for C supports arguments -Wcast-qual: YES 00:05:29.565 Compiler for C supports arguments -Wdeprecated: YES 00:05:29.565 Compiler for C supports arguments -Wformat: YES 00:05:29.565 Compiler for C supports arguments -Wformat-nonliteral: NO 00:05:29.565 Compiler for C supports arguments -Wformat-security: NO 00:05:29.565 Compiler for C supports arguments -Wmissing-declarations: YES 00:05:29.565 Compiler for C supports arguments -Wmissing-prototypes: YES 00:05:29.565 Compiler for C supports arguments -Wnested-externs: YES 00:05:29.565 Compiler for C 
supports arguments -Wold-style-definition: YES 00:05:29.565 Compiler for C supports arguments -Wpointer-arith: YES 00:05:29.565 Compiler for C supports arguments -Wsign-compare: YES 00:05:29.565 Compiler for C supports arguments -Wstrict-prototypes: YES 00:05:29.565 Compiler for C supports arguments -Wundef: YES 00:05:29.565 Compiler for C supports arguments -Wwrite-strings: YES 00:05:29.565 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:05:29.565 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:05:29.565 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:05:29.565 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:05:29.565 Compiler for C supports arguments -mavx512f: YES 00:05:29.565 Checking if "AVX512 checking" compiles: YES 00:05:29.565 Fetching value of define "__SSE4_2__" : 1 00:05:29.565 Fetching value of define "__AES__" : 1 00:05:29.565 Fetching value of define "__AVX__" : 1 00:05:29.565 Fetching value of define "__AVX2__" : 1 00:05:29.565 Fetching value of define "__AVX512BW__" : 1 00:05:29.565 Fetching value of define "__AVX512CD__" : 1 00:05:29.565 Fetching value of define "__AVX512DQ__" : 1 00:05:29.565 Fetching value of define "__AVX512F__" : 1 00:05:29.565 Fetching value of define "__AVX512VL__" : 1 00:05:29.565 Fetching value of define "__PCLMUL__" : 1 00:05:29.565 Fetching value of define "__RDRND__" : 1 00:05:29.565 Fetching value of define "__RDSEED__" : 1 00:05:29.565 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:05:29.565 Compiler for C supports arguments -Wno-format-truncation: YES 00:05:29.565 Message: lib/kvargs: Defining dependency "kvargs" 00:05:29.565 Message: lib/telemetry: Defining dependency "telemetry" 00:05:29.566 Checking for function "getentropy" : YES 00:05:29.566 Message: lib/eal: Defining dependency "eal" 00:05:29.566 Message: lib/ring: Defining dependency "ring" 00:05:29.566 Message: lib/rcu: Defining dependency "rcu" 00:05:29.566 
Message: lib/mempool: Defining dependency "mempool" 00:05:29.566 Message: lib/mbuf: Defining dependency "mbuf" 00:05:29.566 Fetching value of define "__PCLMUL__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512F__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512BW__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512VL__" : 1 (cached) 00:05:29.566 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:05:29.566 Compiler for C supports arguments -mpclmul: YES 00:05:29.566 Compiler for C supports arguments -maes: YES 00:05:29.566 Compiler for C supports arguments -mavx512f: YES (cached) 00:05:29.566 Compiler for C supports arguments -mavx512bw: YES 00:05:29.566 Compiler for C supports arguments -mavx512dq: YES 00:05:29.566 Compiler for C supports arguments -mavx512vl: YES 00:05:29.566 Compiler for C supports arguments -mvpclmulqdq: YES 00:05:29.566 Compiler for C supports arguments -mavx2: YES 00:05:29.566 Compiler for C supports arguments -mavx: YES 00:05:29.566 Message: lib/net: Defining dependency "net" 00:05:29.566 Message: lib/meter: Defining dependency "meter" 00:05:29.566 Message: lib/ethdev: Defining dependency "ethdev" 00:05:29.566 Message: lib/pci: Defining dependency "pci" 00:05:29.566 Message: lib/cmdline: Defining dependency "cmdline" 00:05:29.566 Message: lib/metrics: Defining dependency "metrics" 00:05:29.566 Message: lib/hash: Defining dependency "hash" 00:05:29.566 Message: lib/timer: Defining dependency "timer" 00:05:29.566 Fetching value of define "__AVX2__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512F__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512VL__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512CD__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512BW__" : 1 (cached) 00:05:29.566 Message: lib/acl: Defining dependency "acl" 00:05:29.566 Message: lib/bbdev: Defining dependency 
"bbdev" 00:05:29.566 Message: lib/bitratestats: Defining dependency "bitratestats" 00:05:29.566 Run-time dependency libelf found: YES 0.190 00:05:29.566 Message: lib/bpf: Defining dependency "bpf" 00:05:29.566 Message: lib/cfgfile: Defining dependency "cfgfile" 00:05:29.566 Message: lib/compressdev: Defining dependency "compressdev" 00:05:29.566 Message: lib/cryptodev: Defining dependency "cryptodev" 00:05:29.566 Message: lib/distributor: Defining dependency "distributor" 00:05:29.566 Message: lib/efd: Defining dependency "efd" 00:05:29.566 Message: lib/eventdev: Defining dependency "eventdev" 00:05:29.566 Message: lib/gpudev: Defining dependency "gpudev" 00:05:29.566 Message: lib/gro: Defining dependency "gro" 00:05:29.566 Message: lib/gso: Defining dependency "gso" 00:05:29.566 Message: lib/ip_frag: Defining dependency "ip_frag" 00:05:29.566 Message: lib/jobstats: Defining dependency "jobstats" 00:05:29.566 Message: lib/latencystats: Defining dependency "latencystats" 00:05:29.566 Message: lib/lpm: Defining dependency "lpm" 00:05:29.566 Fetching value of define "__AVX512F__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512IFMA__" : (undefined) 00:05:29.566 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:05:29.566 Message: lib/member: Defining dependency "member" 00:05:29.566 Message: lib/pcapng: Defining dependency "pcapng" 00:05:29.566 Compiler for C supports arguments -Wno-cast-qual: YES 00:05:29.566 Message: lib/power: Defining dependency "power" 00:05:29.566 Message: lib/rawdev: Defining dependency "rawdev" 00:05:29.566 Message: lib/regexdev: Defining dependency "regexdev" 00:05:29.566 Message: lib/dmadev: Defining dependency "dmadev" 00:05:29.566 Message: lib/rib: Defining dependency "rib" 00:05:29.566 Message: lib/reorder: Defining dependency "reorder" 00:05:29.566 Message: lib/sched: Defining dependency "sched" 00:05:29.566 Message: lib/security: 
Defining dependency "security" 00:05:29.566 Message: lib/stack: Defining dependency "stack" 00:05:29.566 Has header "linux/userfaultfd.h" : YES 00:05:29.566 Message: lib/vhost: Defining dependency "vhost" 00:05:29.566 Message: lib/ipsec: Defining dependency "ipsec" 00:05:29.566 Fetching value of define "__AVX512F__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:05:29.566 Fetching value of define "__AVX512BW__" : 1 (cached) 00:05:29.566 Message: lib/fib: Defining dependency "fib" 00:05:29.566 Message: lib/port: Defining dependency "port" 00:05:29.566 Message: lib/pdump: Defining dependency "pdump" 00:05:29.566 Message: lib/table: Defining dependency "table" 00:05:29.566 Message: lib/pipeline: Defining dependency "pipeline" 00:05:29.566 Message: lib/graph: Defining dependency "graph" 00:05:29.566 Message: lib/node: Defining dependency "node" 00:05:29.566 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:05:29.566 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:05:29.566 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:05:29.566 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:05:29.566 Compiler for C supports arguments -std=c11: YES 00:05:29.566 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:05:29.566 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:05:29.566 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:05:29.566 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:05:33.811 Run-time dependency libmlx5 found: YES 1.24.44.0 00:05:33.811 Run-time dependency libibverbs found: YES 1.14.44.0 00:05:33.811 Library mtcr_ul found: NO 00:05:33.811 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has 
symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, 
libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:05:33.811 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header 
"linux/ethtool.h" has symbol "SUPPORTED_40000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_25000baseCR_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_50000baseCR2_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_100000baseKR4_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:05:33.811 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with 
dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/verbs.h" has 
symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:05:33.812 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:05:37.109 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:05:37.109 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:05:37.109 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:05:37.109 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:05:37.109 Configuring mlx5_autoconf.h using configuration 00:05:37.109 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:05:37.109 Run-time dependency libcrypto found: YES 3.0.9 00:05:37.109 Library IPSec_MB found: YES 00:05:37.109 Dependency libcrypto found: YES 3.0.9 (cached) 00:05:37.109 Fetching value of define "IMB_VERSION_STR" : "1.0.0" 00:05:37.109 Message: drivers/common/qat: Defining dependency "common_qat" 00:05:37.109 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:05:37.109 Compiler for C supports arguments -Wno-sign-compare: YES 00:05:37.109 Compiler for C supports arguments -Wno-unused-value: YES 00:05:37.109 Compiler for C supports arguments -Wno-format: YES 00:05:37.109 Compiler for C supports arguments -Wno-format-security: YES 00:05:37.109 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:05:37.109 
Compiler for C supports arguments -Wno-strict-aliasing: YES 00:05:37.109 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:05:37.109 Compiler for C supports arguments -Wno-unused-parameter: YES 00:05:37.109 Fetching value of define "__AVX2__" : 1 (cached) 00:05:37.109 Fetching value of define "__AVX512F__" : 1 (cached) 00:05:37.109 Fetching value of define "__AVX512BW__" : 1 (cached) 00:05:37.109 Compiler for C supports arguments -mavx512f: YES (cached) 00:05:37.109 Compiler for C supports arguments -mavx512bw: YES (cached) 00:05:37.109 Compiler for C supports arguments -march=skylake-avx512: YES 00:05:37.109 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:05:37.109 Library IPSec_MB found: YES 00:05:37.109 Fetching value of define "IMB_VERSION_STR" : "1.0.0" (cached) 00:05:37.109 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:05:37.109 Compiler for C supports arguments -std=c11: YES (cached) 00:05:37.109 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:05:37.109 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:05:37.109 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:05:37.109 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:05:37.109 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:05:37.109 Run-time dependency libisal found: YES 2.29.0 00:05:37.109 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:05:37.109 Compiler for C supports arguments -std=c11: YES (cached) 00:05:37.109 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:05:37.109 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:05:37.109 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:05:37.109 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:05:37.109 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:05:37.109 Program 
doxygen found: YES (/usr/bin/doxygen) 00:05:37.109 Configuring doxy-api.conf using configuration 00:05:37.109 Program sphinx-build found: NO 00:05:37.109 Configuring rte_build_config.h using configuration 00:05:37.109 Message: 00:05:37.109 ================= 00:05:37.109 Applications Enabled 00:05:37.109 ================= 00:05:37.109 00:05:37.109 apps: 00:05:37.109 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:05:37.109 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:05:37.109 test-security-perf, 00:05:37.109 00:05:37.109 Message: 00:05:37.109 ================= 00:05:37.109 Libraries Enabled 00:05:37.109 ================= 00:05:37.109 00:05:37.109 libs: 00:05:37.109 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:05:37.109 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:05:37.109 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:05:37.109 eventdev, gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:05:37.109 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:05:37.109 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:05:37.109 table, pipeline, graph, node, 00:05:37.109 00:05:37.109 Message: 00:05:37.109 =============== 00:05:37.109 Drivers Enabled 00:05:37.109 =============== 00:05:37.109 00:05:37.109 common: 00:05:37.109 mlx5, qat, 00:05:37.109 bus: 00:05:37.109 auxiliary, pci, vdev, 00:05:37.109 mempool: 00:05:37.109 ring, 00:05:37.109 dma: 00:05:37.109 00:05:37.109 net: 00:05:37.109 i40e, 00:05:37.109 raw: 00:05:37.109 00:05:37.109 crypto: 00:05:37.109 ipsec_mb, mlx5, 00:05:37.109 compress: 00:05:37.109 isal, mlx5, 00:05:37.109 regex: 00:05:37.109 00:05:37.109 vdpa: 00:05:37.109 00:05:37.109 event: 00:05:37.109 00:05:37.109 baseband: 00:05:37.109 00:05:37.109 gpu: 00:05:37.109 00:05:37.109 00:05:37.109 Message: 00:05:37.109 ================= 00:05:37.109 Content 
Skipped 00:05:37.109 ================= 00:05:37.109 00:05:37.109 apps: 00:05:37.109 00:05:37.109 libs: 00:05:37.109 kni: explicitly disabled via build config (deprecated lib) 00:05:37.109 flow_classify: explicitly disabled via build config (deprecated lib) 00:05:37.109 00:05:37.109 drivers: 00:05:37.109 common/cpt: not in enabled drivers build config 00:05:37.109 common/dpaax: not in enabled drivers build config 00:05:37.110 common/iavf: not in enabled drivers build config 00:05:37.110 common/idpf: not in enabled drivers build config 00:05:37.110 common/mvep: not in enabled drivers build config 00:05:37.110 common/octeontx: not in enabled drivers build config 00:05:37.110 bus/dpaa: not in enabled drivers build config 00:05:37.110 bus/fslmc: not in enabled drivers build config 00:05:37.110 bus/ifpga: not in enabled drivers build config 00:05:37.110 bus/vmbus: not in enabled drivers build config 00:05:37.110 common/cnxk: not in enabled drivers build config 00:05:37.110 common/sfc_efx: not in enabled drivers build config 00:05:37.110 mempool/bucket: not in enabled drivers build config 00:05:37.110 mempool/cnxk: not in enabled drivers build config 00:05:37.110 mempool/dpaa: not in enabled drivers build config 00:05:37.110 mempool/dpaa2: not in enabled drivers build config 00:05:37.110 mempool/octeontx: not in enabled drivers build config 00:05:37.110 mempool/stack: not in enabled drivers build config 00:05:37.110 dma/cnxk: not in enabled drivers build config 00:05:37.110 dma/dpaa: not in enabled drivers build config 00:05:37.110 dma/dpaa2: not in enabled drivers build config 00:05:37.110 dma/hisilicon: not in enabled drivers build config 00:05:37.110 dma/idxd: not in enabled drivers build config 00:05:37.110 dma/ioat: not in enabled drivers build config 00:05:37.110 dma/skeleton: not in enabled drivers build config 00:05:37.110 net/af_packet: not in enabled drivers build config 00:05:37.110 net/af_xdp: not in enabled drivers build config 00:05:37.110 net/ark: not in 
enabled drivers build config 00:05:37.110 net/atlantic: not in enabled drivers build config 00:05:37.110 net/avp: not in enabled drivers build config 00:05:37.110 net/axgbe: not in enabled drivers build config 00:05:37.110 net/bnx2x: not in enabled drivers build config 00:05:37.110 net/bnxt: not in enabled drivers build config 00:05:37.110 net/bonding: not in enabled drivers build config 00:05:37.110 net/cnxk: not in enabled drivers build config 00:05:37.110 net/cxgbe: not in enabled drivers build config 00:05:37.110 net/dpaa: not in enabled drivers build config 00:05:37.110 net/dpaa2: not in enabled drivers build config 00:05:37.110 net/e1000: not in enabled drivers build config 00:05:37.110 net/ena: not in enabled drivers build config 00:05:37.110 net/enetc: not in enabled drivers build config 00:05:37.110 net/enetfec: not in enabled drivers build config 00:05:37.110 net/enic: not in enabled drivers build config 00:05:37.110 net/failsafe: not in enabled drivers build config 00:05:37.110 net/fm10k: not in enabled drivers build config 00:05:37.110 net/gve: not in enabled drivers build config 00:05:37.110 net/hinic: not in enabled drivers build config 00:05:37.110 net/hns3: not in enabled drivers build config 00:05:37.110 net/iavf: not in enabled drivers build config 00:05:37.110 net/ice: not in enabled drivers build config 00:05:37.110 net/idpf: not in enabled drivers build config 00:05:37.110 net/igc: not in enabled drivers build config 00:05:37.110 net/ionic: not in enabled drivers build config 00:05:37.110 net/ipn3ke: not in enabled drivers build config 00:05:37.110 net/ixgbe: not in enabled drivers build config 00:05:37.110 net/kni: not in enabled drivers build config 00:05:37.110 net/liquidio: not in enabled drivers build config 00:05:37.110 net/mana: not in enabled drivers build config 00:05:37.110 net/memif: not in enabled drivers build config 00:05:37.110 net/mlx4: not in enabled drivers build config 00:05:37.110 net/mlx5: not in enabled drivers build 
config 00:05:37.110 net/mvneta: not in enabled drivers build config 00:05:37.110 net/mvpp2: not in enabled drivers build config 00:05:37.110 net/netvsc: not in enabled drivers build config 00:05:37.110 net/nfb: not in enabled drivers build config 00:05:37.110 net/nfp: not in enabled drivers build config 00:05:37.110 net/ngbe: not in enabled drivers build config 00:05:37.110 net/null: not in enabled drivers build config 00:05:37.110 net/octeontx: not in enabled drivers build config 00:05:37.110 net/octeon_ep: not in enabled drivers build config 00:05:37.110 net/pcap: not in enabled drivers build config 00:05:37.110 net/pfe: not in enabled drivers build config 00:05:37.110 net/qede: not in enabled drivers build config 00:05:37.110 net/ring: not in enabled drivers build config 00:05:37.110 net/sfc: not in enabled drivers build config 00:05:37.110 net/softnic: not in enabled drivers build config 00:05:37.110 net/tap: not in enabled drivers build config 00:05:37.110 net/thunderx: not in enabled drivers build config 00:05:37.110 net/txgbe: not in enabled drivers build config 00:05:37.110 net/vdev_netvsc: not in enabled drivers build config 00:05:37.110 net/vhost: not in enabled drivers build config 00:05:37.110 net/virtio: not in enabled drivers build config 00:05:37.110 net/vmxnet3: not in enabled drivers build config 00:05:37.110 raw/cnxk_bphy: not in enabled drivers build config 00:05:37.110 raw/cnxk_gpio: not in enabled drivers build config 00:05:37.110 raw/dpaa2_cmdif: not in enabled drivers build config 00:05:37.110 raw/ifpga: not in enabled drivers build config 00:05:37.110 raw/ntb: not in enabled drivers build config 00:05:37.110 raw/skeleton: not in enabled drivers build config 00:05:37.110 crypto/armv8: not in enabled drivers build config 00:05:37.110 crypto/bcmfs: not in enabled drivers build config 00:05:37.110 crypto/caam_jr: not in enabled drivers build config 00:05:37.110 crypto/ccp: not in enabled drivers build config 00:05:37.110 crypto/cnxk: not in 
enabled drivers build config 00:05:37.110 crypto/dpaa_sec: not in enabled drivers build config 00:05:37.110 crypto/dpaa2_sec: not in enabled drivers build config 00:05:37.110 crypto/mvsam: not in enabled drivers build config 00:05:37.110 crypto/nitrox: not in enabled drivers build config 00:05:37.110 crypto/null: not in enabled drivers build config 00:05:37.110 crypto/octeontx: not in enabled drivers build config 00:05:37.110 crypto/openssl: not in enabled drivers build config 00:05:37.110 crypto/scheduler: not in enabled drivers build config 00:05:37.110 crypto/uadk: not in enabled drivers build config 00:05:37.110 crypto/virtio: not in enabled drivers build config 00:05:37.110 compress/octeontx: not in enabled drivers build config 00:05:37.110 compress/zlib: not in enabled drivers build config 00:05:37.110 regex/mlx5: not in enabled drivers build config 00:05:37.110 regex/cn9k: not in enabled drivers build config 00:05:37.110 vdpa/ifc: not in enabled drivers build config 00:05:37.110 vdpa/mlx5: not in enabled drivers build config 00:05:37.110 vdpa/sfc: not in enabled drivers build config 00:05:37.110 event/cnxk: not in enabled drivers build config 00:05:37.110 event/dlb2: not in enabled drivers build config 00:05:37.110 event/dpaa: not in enabled drivers build config 00:05:37.110 event/dpaa2: not in enabled drivers build config 00:05:37.110 event/dsw: not in enabled drivers build config 00:05:37.110 event/opdl: not in enabled drivers build config 00:05:37.110 event/skeleton: not in enabled drivers build config 00:05:37.110 event/sw: not in enabled drivers build config 00:05:37.110 event/octeontx: not in enabled drivers build config 00:05:37.110 baseband/acc: not in enabled drivers build config 00:05:37.110 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:05:37.110 baseband/fpga_lte_fec: not in enabled drivers build config 00:05:37.110 baseband/la12xx: not in enabled drivers build config 00:05:37.110 baseband/null: not in enabled drivers build config 
00:05:37.110 baseband/turbo_sw: not in enabled drivers build config 00:05:37.110 gpu/cuda: not in enabled drivers build config 00:05:37.110 00:05:37.110 00:05:37.110 Build targets in project: 355 00:05:37.110 00:05:37.110 DPDK 22.11.4 00:05:37.110 00:05:37.110 User defined options 00:05:37.110 libdir : lib 00:05:37.110 prefix : /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:05:37.110 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow -I/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:05:37.110 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:05:37.110 enable_docs : false 00:05:37.110 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,crypto,crypto/ipsec_mb,crypto/qat,compress/qat,common/qat,bus/auxiliary,common/mlx5,common/mlx5/linux,crypto/mlx5,compress,compress/isal,compress/qat,common/qat,compress/mlx5, 00:05:37.110 enable_kmods : false 00:05:37.110 machine : native 00:05:37.110 tests : false 00:05:37.110 00:05:37.110 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:05:37.110 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
00:05:37.110 16:59:31 build_native_dpdk -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp -j72
00:05:37.110 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp'
00:05:37.110 [1/854] Generating lib/rte_telemetry_mingw with a custom command
00:05:37.110 [2/854] Generating lib/rte_kvargs_def with a custom command
00:05:37.110 [3/854] Generating lib/rte_kvargs_mingw with a custom command
00:05:37.110 [4/854] Generating lib/rte_telemetry_def with a custom command
00:05:37.110 [5/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:05:37.110 [6/854] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:05:37.110 [7/854] Generating lib/rte_eal_def with a custom command
00:05:37.110 [8/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:05:37.110 [9/854] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:05:37.110 [10/854] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:05:37.110 [11/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:05:37.110 [12/854] Generating lib/rte_eal_mingw with a custom command
00:05:37.110 [13/854] Linking static target lib/librte_kvargs.a
00:05:37.110 [14/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:05:37.110 [15/854] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:05:37.110 [16/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o
00:05:37.110 [17/854] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:05:37.110 [18/854] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:05:37.110 [19/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:05:37.110 [20/854] Generating lib/rte_ring_def with a custom command
00:05:37.110 [21/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:05:37.110 [22/854] Generating lib/rte_ring_mingw with a custom command
00:05:37.110 [23/854] Generating lib/rte_rcu_mingw with a custom command
00:05:37.110 [24/854] Generating lib/rte_rcu_def with a custom command
00:05:37.110 [25/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:05:37.110 [26/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:05:37.110 [27/854] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:05:37.111 [28/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:05:37.111 [29/854] Generating lib/rte_mempool_def with a custom command
00:05:37.111 [30/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:05:37.111 [31/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:05:37.111 [32/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:05:37.111 [33/854] Generating lib/rte_mempool_mingw with a custom command
00:05:37.111 [34/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:05:37.111 [35/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:05:37.111 [36/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:05:37.111 [37/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:05:37.111 [38/854] Generating lib/rte_mbuf_def with a custom command
00:05:37.111 [39/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:05:37.111 [40/854] Generating lib/rte_mbuf_mingw with a custom command
00:05:37.111 [41/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:05:37.111 [42/854] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:05:37.111 [43/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:05:37.111 [44/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:05:37.111 [45/854] Generating lib/rte_net_def with a custom command
00:05:37.111 [46/854] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:05:37.111 [47/854] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:05:37.111 [48/854] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:05:37.111 [49/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:05:37.111 [50/854] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:05:37.111 [51/854] Generating lib/rte_net_mingw with a custom command
00:05:37.111 [52/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:05:37.111 [53/854] Generating lib/rte_meter_def with a custom command
00:05:37.111 [54/854] Generating lib/rte_meter_mingw with a custom command
00:05:37.111 [55/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:05:37.111 [56/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:05:37.111 [57/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:05:37.111 [58/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:05:37.111 [59/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:05:37.111 [60/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:05:37.111 [61/854] Generating lib/rte_ethdev_def with a custom command
00:05:37.111 [62/854] Generating lib/rte_ethdev_mingw with a custom command
00:05:37.111 [63/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:05:37.111 [64/854] Generating lib/rte_pci_def with a custom command
00:05:37.111 [65/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:05:37.111 [66/854] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:05:37.111 [67/854] Generating lib/rte_pci_mingw with a custom command
00:05:37.111 [68/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:05:37.111 [69/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:05:37.111 [70/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:05:37.111 [71/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:05:37.111 [72/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:05:37.111 [73/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:05:37.111 [74/854] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:05:37.370 [75/854] Linking static target lib/librte_ring.a
00:05:37.370 [76/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o
00:05:37.370 [77/854] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:05:37.370 [78/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:05:37.370 [79/854] Generating lib/rte_cmdline_mingw with a custom command
00:05:37.370 [80/854] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:05:37.370 [81/854] Generating lib/rte_cmdline_def with a custom command
00:05:37.370 [82/854] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:05:37.370 [83/854] Linking static target lib/librte_pci.a
00:05:37.370 [84/854] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:05:37.370 [85/854] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:05:37.370 [86/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:05:37.370 [87/854] Generating lib/rte_metrics_def with a custom command
00:05:37.370 [88/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:05:37.370 [89/854] Linking static target lib/librte_meter.a
00:05:37.370 [90/854] Generating lib/rte_metrics_mingw with a custom command
00:05:37.370 [91/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:05:37.370 [92/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:05:37.370 [93/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:05:37.370 [94/854] Generating lib/rte_hash_def with a custom command
00:05:37.370 [95/854] Generating lib/rte_hash_mingw with a custom command
00:05:37.370 [96/854] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:05:37.370 [97/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:05:37.370 [98/854] Generating lib/rte_timer_def with a custom command
00:05:37.370 [99/854] Generating lib/rte_timer_mingw with a custom command
00:05:37.370 [100/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:05:37.370 [101/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:05:37.370 [102/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:05:37.370 [103/854] Linking target lib/librte_kvargs.so.23.0
00:05:37.370 [104/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:05:37.370 [105/854] Generating lib/rte_acl_def with a custom command
00:05:37.370 [106/854] Generating lib/rte_acl_mingw with a custom command
00:05:37.633 [107/854] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:05:37.633 [108/854] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:05:37.633 [109/854] Generating lib/rte_bbdev_def with a custom command
00:05:37.633 [110/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:05:37.633 [111/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:05:37.633 [112/854] Generating lib/rte_bbdev_mingw with a custom command
00:05:37.633 [113/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:05:37.633 [114/854] Generating lib/rte_bitratestats_mingw with a custom command
00:05:37.633 [115/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:05:37.633 [116/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:05:37.633 [117/854] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:05:37.633 [118/854] Generating lib/rte_bitratestats_def with a custom command
00:05:37.633 [119/854] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols
00:05:37.633 [120/854] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:05:37.633 [121/854] Generating lib/rte_bpf_def with a custom command
00:05:37.633 [122/854] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:05:37.633 [123/854] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:05:37.633 [124/854] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:05:37.633 [125/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:05:37.633 [126/854] Generating lib/rte_bpf_mingw with a custom command
00:05:37.633 [127/854] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:05:37.633 [128/854] Generating lib/rte_cfgfile_def with a custom command
00:05:37.633 [129/854] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:05:37.633 [130/854] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:05:37.898 [131/854] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:05:37.898 [132/854] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:05:37.898 [133/854] Generating lib/rte_compressdev_mingw with a custom command
00:05:37.898 [134/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:05:37.898 [135/854] Generating lib/rte_compressdev_def with a custom command
00:05:37.898 [136/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:05:37.898 [137/854] Generating lib/rte_cfgfile_mingw with a custom command
00:05:37.898 [138/854] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:05:37.898 [139/854] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:05:37.898 [140/854] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:05:37.898 [141/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:05:37.898 [142/854] Generating lib/rte_cryptodev_def with a custom command
00:05:37.898 [143/854] Generating lib/rte_cryptodev_mingw with a custom command
00:05:37.898 [144/854] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:05:37.898 [145/854] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:05:37.898 [146/854] Generating lib/rte_efd_def with a custom command
00:05:37.898 [147/854] Linking static target lib/librte_telemetry.a
00:05:37.898 [148/854] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:05:37.898 [149/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:05:37.898 [150/854] Generating lib/rte_efd_mingw with a custom command
00:05:37.898 [151/854] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o
00:05:37.898 [152/854] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:05:37.898 [153/854] Generating lib/rte_distributor_mingw with a custom command
00:05:37.898 [154/854] Generating lib/rte_distributor_def with a custom command
00:05:37.898 [155/854] Generating lib/rte_eventdev_def with a custom command
00:05:37.898 [156/854] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:05:37.898 [157/854] Generating lib/rte_eventdev_mingw with a custom command
00:05:37.898 [158/854] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:05:37.898 [159/854] Linking static target lib/net/libnet_crc_avx512_lib.a
00:05:37.898 [160/854] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:05:37.898 [161/854] Generating lib/rte_gpudev_mingw with a custom command
00:05:37.898 [162/854] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
00:05:37.898 [163/854] Generating lib/rte_gpudev_def with a custom command
00:05:37.898 [164/854] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:05:37.898 [165/854] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
00:05:38.163 [166/854] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:05:38.163 [167/854] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:05:38.163 [168/854] Linking static target lib/librte_metrics.a
00:05:38.163 [169/854] Linking static target lib/librte_timer.a
00:05:38.163 [170/854] Linking static target lib/librte_net.a
00:05:38.163 [171/854] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:05:38.163 [172/854] Generating lib/rte_gro_def with a custom command
00:05:38.163 [173/854] Generating lib/rte_gro_mingw with a custom command
00:05:38.163 [174/854] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:05:38.163 [175/854] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:05:38.163 [176/854] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:05:38.163 [177/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o
00:05:38.163 [178/854] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:05:38.163 [179/854] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
00:05:38.163 [180/854] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:05:38.163 [181/854] Linking static target lib/librte_cfgfile.a
00:05:38.163 [182/854] Linking static target lib/librte_rcu.a
00:05:38.163 [183/854] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:05:38.163 [184/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:05:38.163 [185/854] Generating lib/rte_gso_def with a custom command
00:05:38.163 [186/854] Generating lib/rte_gso_mingw with a custom command
00:05:38.163 [187/854] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:05:38.163 [188/854] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o
00:05:38.163 [189/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o
00:05:38.163 [190/854] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:05:38.426 [191/854] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o
00:05:38.426 [192/854] Generating lib/rte_ip_frag_def with a custom command
00:05:38.426 [193/854] Linking static target lib/librte_cmdline.a
00:05:38.426 [194/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o
00:05:38.426 [195/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o
00:05:38.426 [196/854] Generating lib/rte_jobstats_def with a custom command
00:05:38.426 [197/854] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o
00:05:38.426 [198/854] Generating lib/rte_jobstats_mingw with a custom command
00:05:38.426 [199/854] Generating lib/rte_ip_frag_mingw with a custom command
00:05:38.426 [200/854] Linking static target lib/librte_bitratestats.a
00:05:38.426 [201/854] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:05:38.426 [202/854] Generating lib/rte_latencystats_def with a custom command
00:05:38.426 [203/854] Generating lib/rte_latencystats_mingw with a custom command
00:05:38.426 [204/854] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.426 [205/854] Generating lib/rte_lpm_def with a custom command
00:05:38.426 [206/854] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:05:38.426 [207/854] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o
00:05:38.426 [208/854] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o
00:05:38.426 [209/854] Generating lib/rte_lpm_mingw with a custom command
00:05:38.426 [210/854] Linking target lib/librte_telemetry.so.23.0
00:05:38.426 [211/854] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.426 [212/854] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o
00:05:38.693 [213/854] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:05:38.693 [214/854] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o
00:05:38.693 [215/854] Linking static target lib/librte_jobstats.a
00:05:38.693 [216/854] Generating lib/rte_member_mingw with a custom command
00:05:38.693 [217/854] Generating lib/rte_member_def with a custom command
00:05:38.693 [218/854] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:05:38.693 [219/854] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o
00:05:38.693 [220/854] Generating lib/rte_pcapng_mingw with a custom command
00:05:38.693 [221/854] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o
00:05:38.693 [222/854] Generating lib/rte_pcapng_def with a custom command
00:05:38.693 [223/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o
00:05:38.693 [224/854] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.693 [225/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o
00:05:38.693 [226/854] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o
00:05:38.693 [227/854] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols
00:05:38.693 [228/854] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:05:38.694 [229/854] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:05:38.694 [230/854] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.694 [231/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o
00:05:38.694 [232/854] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.694 [233/854] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:05:38.694 [234/854] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:05:38.958 [235/854] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:05:38.958 [236/854] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:05:38.958 [237/854] Linking static target lib/librte_compressdev.a
00:05:38.958 [238/854] Generating lib/rte_power_def with a custom command
00:05:38.958 [239/854] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o
00:05:38.958 [240/854] Generating lib/rte_rawdev_def with a custom command
00:05:38.958 [241/854] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:05:38.958 [242/854] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:05:38.958 [243/854] Generating lib/rte_power_mingw with a custom command
00:05:38.958 [244/854] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o
00:05:38.958 [245/854] Linking static target lib/librte_eal.a
00:05:38.958 [246/854] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o
00:05:38.958 [247/854] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o
00:05:38.958 [248/854] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:05:38.958 [249/854] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o
00:05:38.958 [250/854] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.958 [251/854] Generating lib/rte_rawdev_mingw with a custom command
00:05:38.958 [252/854] Linking static target lib/librte_bbdev.a
00:05:38.958 [253/854] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:05:38.958 [254/854] Linking static target lib/librte_mempool.a
00:05:38.958 [255/854] Generating lib/rte_regexdev_def with a custom command
00:05:38.958 [256/854] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output)
00:05:38.958 [257/854] Generating lib/rte_dmadev_def with a custom command
00:05:38.958 [258/854] Generating lib/rte_regexdev_mingw with a custom command
00:05:38.958 [259/854] Generating lib/rte_rib_def with a custom command
00:05:38.958 [260/854] Generating lib/rte_rib_mingw with a custom command
00:05:38.958 [261/854] Generating lib/rte_reorder_def with a custom command
00:05:38.958 [262/854] Generating lib/rte_reorder_mingw with a custom command
00:05:38.958 [263/854] Generating lib/rte_dmadev_mingw with a custom command
00:05:38.958 [264/854] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o
00:05:38.958 [265/854] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o
00:05:38.958 [266/854] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:05:38.958 [267/854] Generating lib/rte_sched_def with a custom command
00:05:38.958 [268/854] Generating lib/rte_sched_mingw with a custom command
00:05:38.958 [269/854] Generating lib/rte_security_def with a custom command
00:05:38.958 [270/854] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o
00:05:38.958 [271/854] Generating lib/rte_security_mingw with a custom command
00:05:38.958 [272/854] Generating lib/rte_stack_def with a custom command
00:05:38.958 [273/854] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o
00:05:38.958 [274/854] Generating lib/rte_stack_mingw with a custom command
00:05:38.958 [275/854] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o
00:05:38.958 [276/854] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o
00:05:38.958 [277/854] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o
00:05:39.230 [278/854] Linking static target lib/librte_gpudev.a
00:05:39.230 [279/854] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o
00:05:39.230 [280/854] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o
00:05:39.230 [281/854] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o
00:05:39.230 [282/854] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o
00:05:39.230 [283/854] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output)
00:05:39.230 [284/854] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o
00:05:39.230 [285/854] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o
00:05:39.230 [286/854] Generating lib/rte_vhost_def with a custom command
00:05:39.230 [287/854] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o
00:05:39.230 [288/854] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o
00:05:39.230 [289/854] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o
00:05:39.230 [290/854] Generating lib/rte_ipsec_def with a custom command
00:05:39.230 [291/854] Generating lib/rte_vhost_mingw with a custom command
00:05:39.230 [292/854] Generating lib/rte_ipsec_mingw with a custom command
00:05:39.230 [293/854] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:05:39.230 [294/854] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o
00:05:39.230 [295/854] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o
00:05:39.230 [296/854] Linking static target lib/librte_gro.a
00:05:39.230 [297/854] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o
00:05:39.230 [298/854] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o
00:05:39.230 [299/854] Linking static target lib/librte_distributor.a
00:05:39.230 [300/854] Linking static target lib/librte_stack.a
00:05:39.495 [301/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o
00:05:39.495 [302/854] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o
00:05:39.495 [303/854] Compiling C object lib/librte_member.a.p/member_rte_member.c.o
00:05:39.495 [304/854] Linking static target lib/librte_gso.a
00:05:39.495 [305/854] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o
00:05:39.495 [306/854] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o
00:05:39.495 [307/854] Generating lib/rte_fib_def with a custom command
00:05:39.495 [308/854] Linking static target lib/librte_latencystats.a
00:05:39.495 [309/854] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:05:39.495 [310/854] Generating lib/rte_fib_mingw with a custom command
00:05:39.495 [311/854] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o
00:05:39.495 [312/854] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o
00:05:39.495 [313/854] Linking static target lib/librte_mbuf.a
00:05:39.495 [314/854] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o
00:05:39.495 [315/854] Linking static target lib/librte_ip_frag.a
00:05:39.495 [316/854] Linking static target lib/librte_rawdev.a
00:05:39.757 [317/854] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:05:39.757 [318/854] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o
00:05:39.757 [319/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o
00:05:39.757 [320/854] Linking static target lib/librte_dmadev.a
00:05:39.757 [321/854] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o
00:05:39.757 [322/854] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:05:39.757 [323/854] Linking static target lib/member/libsketch_avx512_tmp.a
00:05:39.757 [324/854] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o
00:05:39.757 [325/854] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:05:39.757 [326/854] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output)
00:05:39.757 [327/854] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o
00:05:39.757 [328/854] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o
00:05:39.757 [329/854] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output)
00:05:39.757 [330/854] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o
00:05:39.757 [331/854] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output)
00:05:39.757 [332/854] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output)
00:05:39.757 [333/854] Generating lib/rte_port_def with a custom command
00:05:39.757 [334/854] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o
00:05:39.757 [335/854] Generating lib/rte_port_mingw with a custom command
00:05:39.757 [336/854] Linking static target lib/librte_bpf.a
00:05:39.757 [337/854] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o
00:05:39.757 [338/854] Generating lib/rte_pdump_def with a custom command
00:05:39.757 [339/854] Generating lib/rte_pdump_mingw with a custom command
00:05:39.757 [340/854] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o
00:05:39.757 [341/854] Linking static target lib/librte_regexdev.a
00:05:40.024 [342/854] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.024 [343/854] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o
00:05:40.024 [344/854] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o
00:05:40.024 [345/854] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:05:40.024 [346/854] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:05:40.024 [347/854] Linking static target lib/librte_power.a
00:05:40.024 [348/854] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:05:40.024 [349/854] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o
00:05:40.024 [350/854] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.024 [351/854] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:05:40.024 [352/854] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.024 [353/854] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:05:40.024 [354/854] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o
00:05:40.024 [355/854] Linking static target lib/librte_reorder.a
00:05:40.024 [356/854] Linking static target lib/librte_pcapng.a
00:05:40.024 [357/854] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o
00:05:40.024 [358/854] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o
00:05:40.024 [359/854] Compiling C object lib/librte_security.a.p/security_rte_security.c.o
00:05:40.024 [360/854] Linking static target lib/librte_security.a
00:05:40.024 [361/854] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.024 [362/854] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o
00:05:40.288 [363/854] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o
00:05:40.288 [364/854] Generating lib/rte_table_def with a custom command
00:05:40.288 [365/854] Generating lib/rte_table_mingw with a custom command
00:05:40.288 [366/854] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o
00:05:40.288 [367/854] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o
00:05:40.288 [368/854] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o
00:05:40.288 [369/854] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o
00:05:40.288 [370/854] Generating lib/rte_pipeline_def with a custom command
00:05:40.288 [371/854] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o
00:05:40.288 [372/854] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.288 [373/854] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.548 [374/854] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o
00:05:40.548 [375/854] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.548 [376/854] Generating lib/rte_pipeline_mingw with a custom command
00:05:40.548 [377/854] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o
00:05:40.548 [378/854] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o
00:05:40.548 [379/854] Linking static target lib/librte_efd.a
00:05:40.548 [380/854] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.548 [381/854] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o
00:05:40.548 [382/854] Generating lib/rte_graph_def with a custom command
00:05:40.549 [383/854] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.549 [384/854] Generating lib/rte_graph_mingw with a custom command
00:05:40.549 [385/854] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o
00:05:40.549 [386/854] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o
00:05:40.549 [387/854] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.549 [388/854] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.811 [389/854] Compiling C object lib/librte_node.a.p/node_null.c.o
00:05:40.811 [390/854] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.811 [391/854] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o
00:05:40.811 [392/854] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o
00:05:40.811 [393/854] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o
00:05:40.811 [394/854] Generating lib/rte_node_def with a custom command
00:05:40.811 [395/854] Generating lib/rte_node_mingw with a custom command
00:05:40.811 [396/854] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o
00:05:40.811 [397/854] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o
00:05:40.811 [398/854] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o
00:05:40.812 [399/854] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o
00:05:40.812 [400/854] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o
00:05:40.812 [401/854] Generating drivers/rte_bus_auxiliary_def with a custom command
00:05:40.812 [402/854] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o
00:05:40.812 [403/854] Generating drivers/rte_bus_auxiliary_mingw with a custom command
00:05:40.812 [404/854] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output)
00:05:40.812 [405/854] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o
00:05:41.070 [406/854] Compiling C object lib/librte_fib.a.p/fib_trie.c.o
00:05:41.070 [407/854] Generating drivers/rte_bus_pci_def with a custom command
00:05:41.070 [408/854] Generating drivers/rte_bus_pci_mingw with a custom command
00:05:41.070 [409/854] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o
00:05:41.070 [410/854] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o
00:05:41.070 [411/854] Generating drivers/rte_bus_vdev_mingw with a custom command
00:05:41.070 [412/854] Generating drivers/rte_bus_vdev_def with a custom command
00:05:41.070 [413/854] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output)
00:05:41.070 [414/854] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o
00:05:41.070 [415/854] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o
00:05:41.071 [416/854] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:05:41.071 [417/854] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:05:41.071 [418/854] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o
00:05:41.071 [419/854] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output)
00:05:41.071 
[420/854] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:05:41.071 [421/854] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:05:41.071 [422/854] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:05:41.071 [423/854] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:05:41.071 [424/854] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:05:41.071 [425/854] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:05:41.071 [426/854] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:05:41.071 [427/854] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:05:41.334 [428/854] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:05:41.334 [429/854] Generating drivers/rte_common_mlx5_def with a custom command 00:05:41.334 [430/854] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:05:41.334 [431/854] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:05:41.334 [432/854] Linking static target lib/librte_fib.a 00:05:41.334 [433/854] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:05:41.334 [434/854] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:05:41.334 [435/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:05:41.334 [436/854] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:05:41.334 [437/854] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:05:41.334 [438/854] Generating drivers/rte_common_mlx5_mingw with a custom command 00:05:41.334 [439/854] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:05:41.334 [440/854] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:05:41.334 [441/854] Compiling C object lib/librte_node.a.p/node_log.c.o 00:05:41.334 [442/854] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:05:41.334 [443/854] Linking 
static target lib/librte_graph.a 00:05:41.334 [444/854] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:05:41.596 [445/854] Linking static target drivers/libtmp_rte_bus_vdev.a 00:05:41.596 [446/854] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:05:41.596 [447/854] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:05:41.596 [448/854] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:05:41.596 [449/854] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:05:41.596 [450/854] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:05:41.597 [451/854] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:05:41.597 [452/854] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:05:41.597 [453/854] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:05:41.597 [454/854] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:05:41.597 [455/854] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:05:41.597 [456/854] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:05:41.597 [457/854] Linking static target lib/librte_cryptodev.a 00:05:41.860 [458/854] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:05:41.860 [459/854] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:05:41.860 [460/854] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:05:41.860 [461/854] Linking static target lib/librte_ethdev.a 00:05:41.860 [462/854] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:05:41.860 [463/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:05:41.860 [464/854] Generating drivers/rte_common_qat_def with a custom command 00:05:41.860 [465/854] Generating drivers/rte_common_qat_mingw with a 
custom command 00:05:41.860 [466/854] Generating drivers/rte_mempool_ring_def with a custom command 00:05:41.860 [467/854] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:05:41.860 [468/854] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:05:41.860 [469/854] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:05:41.860 [470/854] Linking static target lib/librte_pdump.a 00:05:41.860 [471/854] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:05:41.860 [472/854] Linking static target drivers/libtmp_rte_bus_pci.a 00:05:41.860 [473/854] Generating drivers/rte_mempool_ring_mingw with a custom command 00:05:41.860 [474/854] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:05:41.860 [475/854] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:05:41.860 [476/854] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:05:41.860 [477/854] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:05:41.860 [478/854] Linking static target drivers/librte_bus_vdev.a 00:05:41.860 [479/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:05:41.860 [480/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:05:41.860 [481/854] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:05:41.860 [482/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:05:41.860 [483/854] Linking static target lib/librte_member.a 00:05:42.120 [484/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:05:42.120 [485/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:05:42.120 [486/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 
00:05:42.120 [487/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:05:42.120 [488/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:05:42.120 [489/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:05:42.120 [490/854] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:05:42.120 [491/854] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:05:42.120 [492/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:05:42.120 [493/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:05:42.120 [494/854] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:05:42.120 [495/854] Linking static target lib/librte_sched.a 00:05:42.120 [496/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:05:42.120 [497/854] Compiling C object drivers/librte_bus_auxiliary.so.23.0.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:05:42.120 [498/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:05:42.120 [499/854] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:05:42.120 [500/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:05:42.120 [501/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:05:42.120 [502/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:05:42.120 [503/854] Linking static target drivers/librte_bus_auxiliary.a 00:05:42.120 [504/854] Generating drivers/rte_net_i40e_def with a custom command 00:05:42.121 [505/854] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:05:42.121 [506/854] Generating drivers/rte_net_i40e_mingw with a custom command 
00:05:42.121 [507/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:05:42.121 [508/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:05:42.380 [509/854] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:05:42.380 [510/854] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:05:42.381 [511/854] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:05:42.381 [512/854] Linking static target drivers/librte_bus_pci.a 00:05:42.381 [513/854] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:05:42.381 [514/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:05:42.381 [515/854] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:05:42.381 [516/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:05:42.381 [517/854] Generating drivers/rte_crypto_ipsec_mb_def with a custom command 00:05:42.381 [518/854] Generating drivers/rte_crypto_ipsec_mb_mingw with a custom command 00:05:42.381 [519/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:05:42.381 [520/854] Generating drivers/rte_crypto_mlx5_def with a custom command 00:05:42.381 [521/854] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:05:42.381 [522/854] Linking static target lib/librte_ipsec.a 00:05:42.381 [523/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:05:42.381 [524/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:05:42.381 [525/854] Generating drivers/rte_crypto_mlx5_mingw with a custom command 00:05:42.381 [526/854] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:05:42.381 [527/854] Compiling C 
object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:05:42.643 [528/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:05:42.643 [529/854] Linking static target lib/librte_hash.a 00:05:42.643 [530/854] Generating drivers/rte_compress_isal_def with a custom command 00:05:42.643 [531/854] Generating drivers/rte_compress_isal_mingw with a custom command 00:05:42.643 [532/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:05:42.643 [533/854] Generating drivers/rte_compress_mlx5_mingw with a custom command 00:05:42.643 [534/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:05:42.643 [535/854] Generating drivers/rte_compress_mlx5_def with a custom command 00:05:42.643 [536/854] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:05:42.643 [537/854] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:05:42.643 [538/854] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:05:42.643 [539/854] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:05:42.643 [540/854] Linking static target lib/librte_rib.a 00:05:42.643 [541/854] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:05:42.643 [542/854] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:05:42.643 [543/854] Linking static target lib/librte_lpm.a 00:05:42.643 [544/854] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:05:42.643 [545/854] Linking static target lib/librte_eventdev.a 00:05:42.902 [546/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:05:42.902 [547/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:05:42.902 [548/854] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:05:42.902 [549/854] Compiling C object 
app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:05:42.902 [550/854] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:05:42.902 [551/854] Linking static target lib/librte_node.a 00:05:42.902 [552/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:05:42.902 [553/854] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:05:42.902 [554/854] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.163 [555/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:05:43.163 [556/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:05:43.163 [557/854] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:05:43.163 [558/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:05:43.163 [559/854] Linking static target lib/librte_port.a 00:05:43.163 [560/854] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:05:43.163 [561/854] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:05:43.163 [562/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:05:43.163 [563/854] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.163 [564/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:05:43.163 [565/854] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:05:43.163 [566/854] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.430 [567/854] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.430 [568/854] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.430 [569/854] Compiling C object 
drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:05:43.430 [570/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:05:43.430 [571/854] Linking static target drivers/libtmp_rte_mempool_ring.a 00:05:43.430 [572/854] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:05:43.430 [573/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:05:43.430 [574/854] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.430 [575/854] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:05:43.430 [576/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:05:43.689 [577/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:05:43.689 [578/854] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:05:43.689 [579/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:05:43.689 [580/854] Linking static target lib/librte_acl.a 00:05:43.689 [581/854] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:05:43.689 [582/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:05:43.689 [583/854] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:05:43.689 [584/854] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:05:43.951 [585/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:05:43.951 [586/854] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:05:43.951 [587/854] Linking static target drivers/librte_mempool_ring.a 00:05:43.951 [588/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:05:43.951 [589/854] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:05:43.951 [590/854] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:05:43.951 [591/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:05:43.951 [592/854] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:05:43.951 [593/854] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:05:43.951 [594/854] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:05:43.951 [595/854] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:05:43.951 [596/854] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:05:43.951 [597/854] Linking static target lib/librte_table.a 00:05:44.216 [598/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:05:44.216 [599/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:05:44.216 [600/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:05:44.216 [601/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:05:44.216 [602/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:05:44.216 [603/854] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:05:44.216 [604/854] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:05:44.216 [605/854] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:05:44.216 [606/854] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:05:44.216 [607/854] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:05:44.480 [608/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:05:44.480 [609/854] Compiling C object 
app/dpdk-proc-info.p/proc-info_main.c.o 00:05:44.480 [610/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:05:44.480 [611/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:05:44.480 [612/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:05:44.480 [613/854] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:05:44.480 [614/854] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:05:44.480 [615/854] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:05:44.480 [616/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:05:44.480 [617/854] Linking static target drivers/librte_compress_mlx5.a 00:05:44.480 [618/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:05:44.480 [619/854] Compiling C object drivers/librte_compress_mlx5.so.23.0.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:05:44.480 [620/854] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:05:44.480 [621/854] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:05:44.750 [622/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:05:44.750 [623/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:05:44.750 [624/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:05:44.750 [625/854] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:05:44.750 [626/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:05:44.750 [627/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:05:44.750 [628/854] Compiling C object 
drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:05:44.750 [629/854] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:05:44.750 [630/854] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:05:44.750 [631/854] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:05:44.750 [632/854] Linking static target drivers/librte_crypto_mlx5.a 00:05:44.750 [633/854] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:05:44.750 [634/854] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:05:44.750 [635/854] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:05:44.750 [636/854] Compiling C object drivers/librte_crypto_mlx5.so.23.0.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:05:44.750 [637/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:05:44.750 [638/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:05:45.012 [639/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:05:45.012 [640/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:05:45.012 [641/854] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:05:45.012 [642/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:05:45.012 [643/854] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:05:45.012 [644/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:05:45.012 [645/854] Linking static target drivers/libtmp_rte_compress_isal.a 00:05:45.012 [646/854] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:05:45.012 [647/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:05:45.012 [648/854] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:05:45.012 [649/854] Linking static target drivers/libtmp_rte_common_mlx5.a 00:05:45.012 [650/854] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:05:45.012 [651/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:05:45.012 [652/854] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:05:45.271 [653/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:05:45.271 [654/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:05:45.271 [655/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:05:45.271 [656/854] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:05:45.271 [657/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:05:45.271 [658/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:05:45.271 [659/854] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:05:45.271 [660/854] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:05:45.271 [661/854] Compiling C object drivers/librte_compress_isal.so.23.0.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:05:45.271 [662/854] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:05:45.271 [663/854] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:05:45.271 [664/854] Linking static target drivers/librte_compress_isal.a 00:05:45.271 [665/854] Linking static target drivers/net/i40e/base/libi40e_base.a 00:05:45.271 [666/854] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:05:45.529 [667/854] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:05:45.529 [668/854] Linking 
static target drivers/net/i40e/libi40e_avx512_lib.a 00:05:45.529 [669/854] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:05:45.529 [670/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:05:45.529 [671/854] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:05:45.529 [672/854] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:05:45.529 [673/854] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:05:45.529 [674/854] Compiling C object drivers/librte_common_mlx5.so.23.0.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:05:45.529 [675/854] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:05:45.529 [676/854] Linking static target drivers/librte_common_mlx5.a 00:05:45.529 [677/854] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:05:45.787 [678/854] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:05:45.787 [679/854] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:05:45.787 [680/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:05:45.787 [681/854] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:05:45.787 [682/854] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:05:45.787 [683/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:05:45.787 [684/854] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:05:45.787 [685/854] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:05:45.787 [686/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:05:45.787 [687/854] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:05:45.787 [688/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:05:45.787 [689/854] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to 
capture output) 00:05:45.787 [690/854] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:05:45.787 [691/854] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:05:45.787 [692/854] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:05:46.046 [693/854] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:05:46.046 [694/854] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:05:46.046 [695/854] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:05:46.046 [696/854] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:05:46.046 [697/854] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:05:46.305 [698/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:05:46.305 [699/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:05:46.305 [700/854] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:05:46.305 [701/854] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:05:46.563 [702/854] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:05:46.563 [703/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:05:46.563 [704/854] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:05:46.823 [705/854] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:05:46.823 [706/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:05:46.823 [707/854] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:05:47.081 [708/854] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:05:47.081 [709/854] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:05:47.081 [710/854] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:05:47.341 [711/854] 
Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o
00:05:47.341 [712/854] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o
00:05:47.341 [713/854] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o
00:05:47.600 [714/854] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a
00:05:47.861 [715/854] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command
00:05:47.861 [716/854] Compiling C object drivers/librte_crypto_ipsec_mb.so.23.0.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o
00:05:47.861 [717/854] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o
00:05:47.861 [718/854] Linking static target drivers/librte_crypto_ipsec_mb.a
00:05:48.426 [719/854] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o
00:05:48.426 [720/854] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:05:48.685 [721/854] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o
00:05:48.685 [722/854] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:05:48.685 [723/854] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o
00:05:48.685 [724/854] Linking static target drivers/libtmp_rte_net_i40e.a
00:05:48.943 [725/854] Generating drivers/rte_net_i40e.pmd.c with a custom command
00:05:49.201 [726/854] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:05:49.201 [727/854] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:05:49.201 [728/854] Linking static target drivers/librte_net_i40e.a
00:05:49.770 [729/854] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o
00:05:49.770 [730/854] Linking static target drivers/libtmp_rte_common_qat.a
00:05:50.030 [731/854] Generating drivers/rte_common_qat.pmd.c with a custom command
00:05:50.030 [732/854] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o
00:05:50.030 [733/854] Compiling C object drivers/librte_common_qat.so.23.0.p/meson-generated_.._rte_common_qat.pmd.c.o
00:05:50.030 [734/854] Linking static target drivers/librte_common_qat.a
00:05:50.030 [735/854] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:05:50.288 [736/854] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:05:50.857 [737/854] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output)
00:05:51.116 [738/854] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output)
00:05:53.650 [739/854] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:05:53.650 [740/854] Linking target lib/librte_eal.so.23.0
00:05:53.650 [741/854] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols
00:05:53.650 [742/854] Linking target lib/librte_ring.so.23.0
00:05:53.650 [743/854] Linking target lib/librte_rawdev.so.23.0
00:05:53.650 [744/854] Linking target lib/librte_pci.so.23.0
00:05:53.650 [745/854] Linking target lib/librte_meter.so.23.0
00:05:53.650 [746/854] Linking target lib/librte_cfgfile.so.23.0
00:05:53.650 [747/854] Linking target lib/librte_jobstats.so.23.0
00:05:53.650 [748/854] Linking target lib/librte_dmadev.so.23.0
00:05:53.650 [749/854] Linking target lib/librte_timer.so.23.0
00:05:53.650 [750/854] Linking target drivers/librte_bus_vdev.so.23.0
00:05:53.650 [751/854] Linking target lib/librte_stack.so.23.0
00:05:53.650 [752/854] Linking target drivers/librte_bus_auxiliary.so.23.0
00:05:53.650 [753/854] Linking target lib/librte_graph.so.23.0
00:05:53.650 [754/854] Linking target lib/librte_acl.so.23.0
00:05:53.909 [755/854] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols
00:05:53.909 [756/854] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols
00:05:53.909 [757/854] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols
00:05:53.909 [758/854] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols
00:05:53.909 [759/854] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols
00:05:53.909 [760/854] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols
00:05:53.909 [761/854] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols
00:05:53.909 [762/854] Generating symbol file drivers/librte_bus_auxiliary.so.23.0.p/librte_bus_auxiliary.so.23.0.symbols
00:05:53.909 [763/854] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols
00:05:53.909 [764/854] Linking target drivers/librte_bus_pci.so.23.0
00:05:53.909 [765/854] Linking target lib/librte_rcu.so.23.0
00:05:53.909 [766/854] Linking target lib/librte_mempool.so.23.0
00:05:53.909 [767/854] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols
00:05:54.168 [768/854] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols
00:05:54.168 [769/854] Linking target lib/librte_rib.so.23.0
00:05:54.168 [770/854] Linking target lib/librte_mbuf.so.23.0
00:05:54.168 [771/854] Linking target drivers/librte_mempool_ring.so.23.0
00:05:54.168 [772/854] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols
00:05:54.168 [773/854] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols
00:05:54.168 [774/854] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols
00:05:54.168 [775/854] Linking target lib/librte_gpudev.so.23.0
00:05:54.168 [776/854] Linking target lib/librte_bbdev.so.23.0
00:05:54.168 [777/854] Linking target lib/librte_net.so.23.0
00:05:54.426 [778/854] Linking target lib/librte_cryptodev.so.23.0
00:05:54.426 [779/854] Linking target lib/librte_regexdev.so.23.0
00:05:54.426 [780/854] Linking target lib/librte_reorder.so.23.0
00:05:54.426 [781/854] Linking target lib/librte_distributor.so.23.0
00:05:54.426 [782/854] Linking target lib/librte_sched.so.23.0
00:05:54.426 [783/854] Linking target lib/librte_fib.so.23.0
00:05:54.426 [784/854] Linking target lib/librte_compressdev.so.23.0
00:05:54.426 [785/854] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols
00:05:54.426 [786/854] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols
00:05:54.426 [787/854] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols
00:05:54.426 [788/854] Generating symbol file lib/librte_compressdev.so.23.0.p/librte_compressdev.so.23.0.symbols
00:05:54.426 [789/854] Linking target lib/librte_hash.so.23.0
00:05:54.426 [790/854] Linking target lib/librte_cmdline.so.23.0
00:05:54.426 [791/854] Linking target lib/librte_security.so.23.0
00:05:54.426 [792/854] Linking target lib/librte_ethdev.so.23.0
00:05:54.684 [793/854] Linking target drivers/librte_compress_isal.so.23.0
00:05:54.684 [794/854] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols
00:05:54.684 [795/854] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols
00:05:54.684 [796/854] Linking target lib/librte_efd.so.23.0
00:05:54.684 [797/854] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols
00:05:54.684 [798/854] Linking target lib/librte_lpm.so.23.0
00:05:54.684 [799/854] Linking target lib/librte_member.so.23.0
00:05:54.684 [800/854] Linking target drivers/librte_common_mlx5.so.23.0
00:05:54.684 [801/854] Linking target lib/librte_ipsec.so.23.0
00:05:54.684 [802/854] Linking target drivers/librte_crypto_ipsec_mb.so.23.0
00:05:54.684 [803/854] Linking target lib/librte_metrics.so.23.0
00:05:54.684 [804/854] Linking target lib/librte_ip_frag.so.23.0
00:05:54.684 [805/854] Linking target lib/librte_gso.so.23.0
00:05:54.684 [806/854] Linking target lib/librte_pcapng.so.23.0
00:05:54.684 [807/854] Linking target lib/librte_gro.so.23.0
00:05:54.684 [808/854] Linking target lib/librte_bpf.so.23.0
00:05:54.942 [809/854] Linking target lib/librte_eventdev.so.23.0
00:05:54.942 [810/854] Linking target drivers/librte_common_qat.so.23.0
00:05:54.942 [811/854] Linking target lib/librte_power.so.23.0
00:05:54.942 [812/854] Linking target drivers/librte_net_i40e.so.23.0
00:05:54.942 [813/854] Generating symbol file drivers/librte_common_mlx5.so.23.0.p/librte_common_mlx5.so.23.0.symbols
00:05:54.942 [814/854] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols
00:05:54.942 [815/854] Linking target drivers/librte_compress_mlx5.so.23.0
00:05:54.942 [816/854] Linking target drivers/librte_crypto_mlx5.so.23.0
00:05:54.942 [817/854] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols
00:05:54.942 [818/854] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols
00:05:54.942 [819/854] Linking target lib/librte_node.so.23.0
00:05:54.942 [820/854] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols
00:05:54.942 [821/854] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols
00:05:55.200 [822/854] Linking target lib/librte_port.so.23.0
00:05:55.200 [823/854] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols
00:05:55.200 [824/854] Linking target lib/librte_pdump.so.23.0
00:05:55.200 [825/854] Linking target lib/librte_latencystats.so.23.0
00:05:55.200 [826/854] Linking target lib/librte_bitratestats.so.23.0
00:05:55.200 [827/854] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols
00:05:55.200 [828/854] Linking target lib/librte_table.so.23.0
00:05:55.458 [829/854] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols
00:05:57.993 [830/854] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:05:57.993 [831/854] Linking static target lib/librte_vhost.a
00:05:58.929 [832/854] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o
00:05:58.929 [833/854] Linking static target lib/librte_pipeline.a
00:05:59.189 [834/854] Linking target app/dpdk-pdump
00:05:59.189 [835/854] Linking target app/dpdk-test-acl
00:05:59.189 [836/854] Linking target app/dpdk-test-flow-perf
00:05:59.541 [837/854] Linking target app/dpdk-dumpcap
00:05:59.541 [838/854] Linking target app/dpdk-test-cmdline
00:05:59.541 [839/854] Linking target app/dpdk-proc-info
00:05:59.541 [840/854] Linking target app/dpdk-test-fib
00:05:59.541 [841/854] Linking target app/dpdk-test-gpudev
00:05:59.541 [842/854] Linking target app/dpdk-test-crypto-perf
00:05:59.541 [843/854] Linking target app/dpdk-test-compress-perf
00:05:59.541 [844/854] Linking target app/dpdk-test-sad
00:05:59.541 [845/854] Linking target app/dpdk-test-eventdev
00:05:59.541 [846/854] Linking target app/dpdk-test-security-perf
00:05:59.541 [847/854] Linking target app/dpdk-test-bbdev
00:05:59.541 [848/854] Linking target app/dpdk-test-pipeline
00:05:59.541 [849/854] Linking target app/dpdk-testpmd
00:06:00.109 [850/854] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:06:00.109 [851/854] Linking target lib/librte_vhost.so.23.0
00:06:00.109 [852/854] Linking target app/dpdk-test-regex
00:06:04.309 [853/854] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:06:04.309 [854/854] Linking target lib/librte_pipeline.so.23.0
00:06:04.309 16:59:59 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s
00:06:04.309 16:59:59 build_native_dpdk -- common/autobuild_common.sh@191 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:06:04.309 16:59:59 build_native_dpdk -- common/autobuild_common.sh@204 -- $ ninja -C /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp -j72 install
00:06:04.309 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp'
00:06:04.309 [0/1] Installing files.
00:06:04.571 Installing subdir /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.571 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:06:04.572 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process
00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:06:04.573 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:06:04.573 
Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 
Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/rt.c 
to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/server/init.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.573 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/link.c to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:06:04.574 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/conn.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.574 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.834 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:06:04.835 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:06:04.835 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:06:04.835 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:06:04.835 Installing lib/librte_kvargs.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_telemetry.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_eal.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_ring.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_rcu.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_mempool.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_mbuf.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_mbuf.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_net.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_meter.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_ethdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_pci.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_cmdline.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_metrics.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_hash.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_timer.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_acl.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 
Installing lib/librte_bbdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_bpf.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_compressdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_distributor.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_efd.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_eventdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing 
lib/librte_gpudev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_gro.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_gso.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_jobstats.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_latencystats.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_lpm.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_member.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_pcapng.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_power.a to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_rawdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_regexdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_dmadev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.835 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_rib.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_reorder.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_sched.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_security.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_stack.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_vhost.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 
00:06:04.836 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_ipsec.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_fib.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_port.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_pdump.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_table.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_pipeline.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_graph.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_node.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing drivers/librte_bus_auxiliary.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing drivers/librte_bus_auxiliary.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:04.836 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:04.836 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:04.836 Installing drivers/librte_common_mlx5.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:04.836 Installing drivers/librte_common_mlx5.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_common_qat.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_common_qat.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_crypto_ipsec_mb.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_crypto_ipsec_mb.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_crypto_mlx5.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_crypto_mlx5.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_compress_isal.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_compress_isal.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing drivers/librte_compress_mlx5.a to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.098 Installing drivers/librte_compress_mlx5.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:06:05.098 Installing app/dpdk-dumpcap to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-pdump to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-proc-info to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-acl to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.098 Installing app/dpdk-test-fib to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-testpmd 
to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-test-regex to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-test-sad to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic 00:06:05.099 Installing 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include/generic
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_eal.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.099 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.100 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing
/var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h
to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include
00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.101 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/reorder/rte_reorder.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_acl.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.102 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-devbind.py to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/bin 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/pkgconfig 00:06:05.103 Installing /var/jenkins/workspace/crypto-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/pkgconfig 00:06:05.103 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:06:05.103 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:06:05.103 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:06:05.103 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:06:05.103 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:06:05.103 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eal.so 00:06:05.103 Installing symlink pointing to librte_ring.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:06:05.103 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ring.so 00:06:05.103 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:06:05.103 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rcu.so 00:06:05.103 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:06:05.103 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mempool.so 00:06:05.103 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:06:05.103 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:06:05.103 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_net.so.23 00:06:05.103 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_net.so 00:06:05.103 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:06:05.103 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_meter.so 00:06:05.103 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:06:05.103 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:06:05.103 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:06:05.103 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pci.so 00:06:05.103 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:06:05.103 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:06:05.103 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:06:05.103 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_metrics.so 00:06:05.103 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:06:05.103 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_hash.so 00:06:05.103 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:06:05.103 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_timer.so 00:06:05.103 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:06:05.103 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_acl.so 00:06:05.103 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:06:05.103 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:06:05.103 Installing symlink pointing to librte_bitratestats.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:06:05.103 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:06:05.103 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:06:05.103 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_bpf.so 00:06:05.103 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:06:05.103 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:06:05.103 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:06:05.103 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:06:05.103 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:06:05.103 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:06:05.103 Installing symlink pointing to librte_distributor.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:06:05.103 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_distributor.so 00:06:05.103 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:06:05.103 Installing symlink pointing to librte_efd.so.23 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_efd.so 00:06:05.103 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:06:05.103 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:06:05.103 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:06:05.104 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:06:05.104 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:06:05.104 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gro.so 00:06:05.104 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:06:05.104 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_gso.so 00:06:05.104 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:06:05.104 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:06:05.104 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:06:05.104 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:06:05.104 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:06:05.104 Installing symlink 
pointing to librte_latencystats.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:06:05.104 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:06:05.104 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_lpm.so 00:06:05.104 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_member.so.23 00:06:05.104 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_member.so 00:06:05.104 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:06:05.104 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:06:05.104 Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_power.so.23 00:06:05.104 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_power.so 00:06:05.104 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:06:05.104 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:06:05.104 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:06:05.104 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:06:05.104 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 
00:06:05.104 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:06:05.104 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:06:05.104 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_rib.so 00:06:05.104 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:06:05.104 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_reorder.so 00:06:05.104 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:06:05.104 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_sched.so 00:06:05.104 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_security.so.23 00:06:05.104 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_security.so 00:06:05.104 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:06:05.104 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_stack.so 00:06:05.104 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:06:05.104 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_vhost.so 00:06:05.104 Installing symlink pointing to librte_ipsec.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:06:05.104 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:06:05.104 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:06:05.104 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_fib.so 00:06:05.104 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_port.so.23 00:06:05.104 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_port.so 00:06:05.104 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:06:05.104 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pdump.so 00:06:05.104 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_table.so.23 00:06:05.104 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_table.so 00:06:05.104 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:06:05.104 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:06:05.104 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:06:05.104 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_graph.so 00:06:05.104 Installing symlink pointing to librte_node.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_node.so.23 00:06:05.104 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/librte_node.so 00:06:05.104 Installing symlink pointing to librte_bus_auxiliary.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_auxiliary.so.23 00:06:05.104 Installing symlink pointing to librte_bus_auxiliary.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_auxiliary.so 00:06:05.104 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:06:05.104 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:06:05.104 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:06:05.104 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:06:05.104 Installing symlink pointing to librte_common_mlx5.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_mlx5.so.23 00:06:05.104 Installing symlink pointing to librte_common_mlx5.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_mlx5.so 00:06:05.104 Installing symlink pointing to librte_common_qat.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_qat.so.23 00:06:05.104 Installing symlink pointing to librte_common_qat.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_qat.so 00:06:05.104 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:06:05.104 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:06:05.104 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:06:05.104 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:06:05.104 './librte_bus_auxiliary.so' -> 'dpdk/pmds-23.0/librte_bus_auxiliary.so' 00:06:05.104 './librte_bus_auxiliary.so.23' -> 'dpdk/pmds-23.0/librte_bus_auxiliary.so.23' 00:06:05.104 './librte_bus_auxiliary.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_auxiliary.so.23.0' 00:06:05.104 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:06:05.104 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:06:05.104 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:06:05.104 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:06:05.104 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:06:05.104 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:06:05.104 './librte_common_mlx5.so' -> 'dpdk/pmds-23.0/librte_common_mlx5.so' 00:06:05.104 './librte_common_mlx5.so.23' -> 'dpdk/pmds-23.0/librte_common_mlx5.so.23' 00:06:05.104 './librte_common_mlx5.so.23.0' -> 'dpdk/pmds-23.0/librte_common_mlx5.so.23.0' 00:06:05.104 './librte_common_qat.so' -> 'dpdk/pmds-23.0/librte_common_qat.so' 00:06:05.104 './librte_common_qat.so.23' -> 'dpdk/pmds-23.0/librte_common_qat.so.23' 00:06:05.104 './librte_common_qat.so.23.0' -> 'dpdk/pmds-23.0/librte_common_qat.so.23.0' 00:06:05.105 './librte_compress_isal.so' -> 'dpdk/pmds-23.0/librte_compress_isal.so' 00:06:05.105 './librte_compress_isal.so.23' -> 
'dpdk/pmds-23.0/librte_compress_isal.so.23' 00:06:05.105 './librte_compress_isal.so.23.0' -> 'dpdk/pmds-23.0/librte_compress_isal.so.23.0' 00:06:05.105 './librte_compress_mlx5.so' -> 'dpdk/pmds-23.0/librte_compress_mlx5.so' 00:06:05.105 './librte_compress_mlx5.so.23' -> 'dpdk/pmds-23.0/librte_compress_mlx5.so.23' 00:06:05.105 './librte_compress_mlx5.so.23.0' -> 'dpdk/pmds-23.0/librte_compress_mlx5.so.23.0' 00:06:05.105 './librte_crypto_ipsec_mb.so' -> 'dpdk/pmds-23.0/librte_crypto_ipsec_mb.so' 00:06:05.105 './librte_crypto_ipsec_mb.so.23' -> 'dpdk/pmds-23.0/librte_crypto_ipsec_mb.so.23' 00:06:05.105 './librte_crypto_ipsec_mb.so.23.0' -> 'dpdk/pmds-23.0/librte_crypto_ipsec_mb.so.23.0' 00:06:05.105 './librte_crypto_mlx5.so' -> 'dpdk/pmds-23.0/librte_crypto_mlx5.so' 00:06:05.105 './librte_crypto_mlx5.so.23' -> 'dpdk/pmds-23.0/librte_crypto_mlx5.so.23' 00:06:05.105 './librte_crypto_mlx5.so.23.0' -> 'dpdk/pmds-23.0/librte_crypto_mlx5.so.23.0' 00:06:05.105 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:06:05.105 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:06:05.105 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:06:05.105 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:06:05.105 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:06:05.105 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:06:05.105 Installing symlink pointing to librte_crypto_ipsec_mb.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_ipsec_mb.so.23 00:06:05.105 Installing symlink pointing to librte_crypto_ipsec_mb.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_ipsec_mb.so 00:06:05.105 Installing symlink pointing to librte_crypto_mlx5.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_mlx5.so.23 00:06:05.105 
Installing symlink pointing to librte_crypto_mlx5.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_mlx5.so 00:06:05.105 Installing symlink pointing to librte_compress_isal.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_isal.so.23 00:06:05.105 Installing symlink pointing to librte_compress_isal.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_isal.so 00:06:05.105 Installing symlink pointing to librte_compress_mlx5.so.23.0 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_mlx5.so.23 00:06:05.105 Installing symlink pointing to librte_compress_mlx5.so.23 to /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_mlx5.so 00:06:05.105 Running custom install script '/bin/sh /var/jenkins/workspace/crypto-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:06:05.364 17:00:00 build_native_dpdk -- common/autobuild_common.sh@210 -- $ cat 00:06:05.364 17:00:00 build_native_dpdk -- common/autobuild_common.sh@215 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:05.364 00:06:05.364 real 5m0.603s 00:06:05.364 user 22m28.958s 00:06:05.364 sys 3m2.688s 00:06:05.364 17:00:00 build_native_dpdk -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:06:05.364 17:00:00 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:06:05.364 ************************************ 00:06:05.364 END TEST build_native_dpdk 00:06:05.364 ************************************ 00:06:05.364 17:00:00 -- common/autotest_common.sh@1142 -- $ return 0 00:06:05.364 17:00:00 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:06:05.364 17:00:00 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:06:05.364 17:00:00 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:06:05.364 17:00:00 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:06:05.364 
17:00:00 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:06:05.364 17:00:00 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:06:05.364 17:00:00 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:06:05.364 17:00:00 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build --with-shared 00:06:05.364 Using /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:06:05.624 DPDK libraries: /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:06:05.624 DPDK includes: //var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:06:05.624 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:06:06.192 Using 'verbs' RDMA provider 00:06:22.454 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:06:37.351 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:06:37.922 Creating mk/config.mk...done. 00:06:37.922 Creating mk/cc.flags.mk...done. 00:06:37.922 Type 'make' to build. 00:06:37.922 17:00:33 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:06:37.922 17:00:33 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:06:37.922 17:00:33 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:06:37.922 17:00:33 -- common/autotest_common.sh@10 -- $ set +x 00:06:37.922 ************************************ 00:06:37.922 START TEST make 00:06:37.922 ************************************ 00:06:37.922 17:00:33 make -- common/autotest_common.sh@1123 -- $ make -j72 00:06:38.493 make[1]: Nothing to be done for 'all'. 
00:06:56.618 CC lib/ut/ut.o 00:06:56.618 CC lib/log/log.o 00:06:56.618 CC lib/log/log_flags.o 00:06:56.618 CC lib/log/log_deprecated.o 00:06:56.618 CC lib/ut_mock/mock.o 00:06:56.618 LIB libspdk_ut_mock.a 00:06:56.618 LIB libspdk_ut.a 00:06:56.618 LIB libspdk_log.a 00:06:56.618 SO libspdk_ut.so.2.0 00:06:56.618 SO libspdk_ut_mock.so.6.0 00:06:56.618 SO libspdk_log.so.7.0 00:06:56.618 SYMLINK libspdk_ut_mock.so 00:06:56.618 SYMLINK libspdk_log.so 00:06:56.618 SYMLINK libspdk_ut.so 00:06:56.618 CC lib/ioat/ioat.o 00:06:56.618 CC lib/util/base64.o 00:06:56.618 CXX lib/trace_parser/trace.o 00:06:56.618 CC lib/dma/dma.o 00:06:56.618 CC lib/util/bit_array.o 00:06:56.618 CC lib/util/cpuset.o 00:06:56.618 CC lib/util/crc16.o 00:06:56.618 CC lib/util/crc32.o 00:06:56.618 CC lib/util/crc32c.o 00:06:56.618 CC lib/util/crc32_ieee.o 00:06:56.618 CC lib/util/crc64.o 00:06:56.618 CC lib/util/dif.o 00:06:56.618 CC lib/util/fd.o 00:06:56.618 CC lib/util/fd_group.o 00:06:56.618 CC lib/util/file.o 00:06:56.618 CC lib/util/hexlify.o 00:06:56.618 CC lib/util/iov.o 00:06:56.618 CC lib/util/math.o 00:06:56.618 CC lib/util/net.o 00:06:56.618 CC lib/util/pipe.o 00:06:56.618 CC lib/util/strerror_tls.o 00:06:56.618 CC lib/util/string.o 00:06:56.618 CC lib/util/uuid.o 00:06:56.618 CC lib/util/xor.o 00:06:56.618 CC lib/util/zipf.o 00:06:56.618 CC lib/vfio_user/host/vfio_user_pci.o 00:06:56.618 CC lib/vfio_user/host/vfio_user.o 00:06:56.618 LIB libspdk_ioat.a 00:06:56.618 SO libspdk_ioat.so.7.0 00:06:56.618 LIB libspdk_dma.a 00:06:56.877 SYMLINK libspdk_ioat.so 00:06:56.877 LIB libspdk_vfio_user.a 00:06:56.877 SO libspdk_dma.so.4.0 00:06:56.877 LIB libspdk_util.a 00:06:56.877 SO libspdk_vfio_user.so.5.0 00:06:56.877 SYMLINK libspdk_dma.so 00:06:56.877 SO libspdk_util.so.10.0 00:06:56.877 SYMLINK libspdk_vfio_user.so 00:06:57.136 SYMLINK libspdk_util.so 00:06:57.136 LIB libspdk_trace_parser.a 00:06:57.394 SO libspdk_trace_parser.so.5.0 00:06:57.394 SYMLINK libspdk_trace_parser.so 00:06:57.394 CC 
lib/json/json_util.o 00:06:57.394 CC lib/json/json_parse.o 00:06:57.394 CC lib/json/json_write.o 00:06:57.394 CC lib/idxd/idxd.o 00:06:57.394 CC lib/idxd/idxd_user.o 00:06:57.394 CC lib/idxd/idxd_kernel.o 00:06:57.394 CC lib/reduce/reduce.o 00:06:57.394 CC lib/rdma_utils/rdma_utils.o 00:06:57.394 CC lib/env_dpdk/env.o 00:06:57.394 CC lib/env_dpdk/memory.o 00:06:57.394 CC lib/vmd/vmd.o 00:06:57.394 CC lib/vmd/led.o 00:06:57.394 CC lib/env_dpdk/pci.o 00:06:57.394 CC lib/env_dpdk/init.o 00:06:57.394 CC lib/conf/conf.o 00:06:57.394 CC lib/rdma_provider/common.o 00:06:57.394 CC lib/env_dpdk/threads.o 00:06:57.394 CC lib/rdma_provider/rdma_provider_verbs.o 00:06:57.394 CC lib/env_dpdk/pci_ioat.o 00:06:57.394 CC lib/env_dpdk/pci_virtio.o 00:06:57.394 CC lib/env_dpdk/pci_vmd.o 00:06:57.394 CC lib/env_dpdk/pci_idxd.o 00:06:57.394 CC lib/env_dpdk/pci_event.o 00:06:57.394 CC lib/env_dpdk/sigbus_handler.o 00:06:57.394 CC lib/env_dpdk/pci_dpdk_2207.o 00:06:57.394 CC lib/env_dpdk/pci_dpdk.o 00:06:57.394 CC lib/env_dpdk/pci_dpdk_2211.o 00:06:57.653 LIB libspdk_rdma_provider.a 00:06:57.653 SO libspdk_rdma_provider.so.6.0 00:06:57.653 LIB libspdk_conf.a 00:06:57.653 LIB libspdk_rdma_utils.a 00:06:57.912 SO libspdk_conf.so.6.0 00:06:57.912 LIB libspdk_json.a 00:06:57.912 SYMLINK libspdk_rdma_provider.so 00:06:57.912 SO libspdk_rdma_utils.so.1.0 00:06:57.912 SO libspdk_json.so.6.0 00:06:57.912 SYMLINK libspdk_conf.so 00:06:57.912 SYMLINK libspdk_rdma_utils.so 00:06:57.912 SYMLINK libspdk_json.so 00:06:58.169 LIB libspdk_idxd.a 00:06:58.169 SO libspdk_idxd.so.12.0 00:06:58.169 LIB libspdk_reduce.a 00:06:58.169 LIB libspdk_vmd.a 00:06:58.169 SO libspdk_reduce.so.6.1 00:06:58.169 SYMLINK libspdk_idxd.so 00:06:58.169 SO libspdk_vmd.so.6.0 00:06:58.169 SYMLINK libspdk_reduce.so 00:06:58.169 CC lib/jsonrpc/jsonrpc_server.o 00:06:58.169 CC lib/jsonrpc/jsonrpc_client.o 00:06:58.169 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:06:58.169 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:06:58.427 SYMLINK 
libspdk_vmd.so 00:06:58.427 LIB libspdk_jsonrpc.a 00:06:58.686 SO libspdk_jsonrpc.so.6.0 00:06:58.686 SYMLINK libspdk_jsonrpc.so 00:06:58.945 LIB libspdk_env_dpdk.a 00:06:58.945 SO libspdk_env_dpdk.so.15.0 00:06:59.202 SYMLINK libspdk_env_dpdk.so 00:06:59.202 CC lib/rpc/rpc.o 00:06:59.461 LIB libspdk_rpc.a 00:06:59.461 SO libspdk_rpc.so.6.0 00:06:59.461 SYMLINK libspdk_rpc.so 00:07:00.028 CC lib/trace/trace.o 00:07:00.028 CC lib/trace/trace_flags.o 00:07:00.028 CC lib/trace/trace_rpc.o 00:07:00.028 CC lib/notify/notify.o 00:07:00.028 CC lib/notify/notify_rpc.o 00:07:00.028 CC lib/keyring/keyring.o 00:07:00.028 CC lib/keyring/keyring_rpc.o 00:07:00.028 LIB libspdk_notify.a 00:07:00.028 LIB libspdk_trace.a 00:07:00.028 SO libspdk_notify.so.6.0 00:07:00.286 SO libspdk_trace.so.10.0 00:07:00.286 LIB libspdk_keyring.a 00:07:00.286 SYMLINK libspdk_notify.so 00:07:00.286 SO libspdk_keyring.so.1.0 00:07:00.286 SYMLINK libspdk_trace.so 00:07:00.286 SYMLINK libspdk_keyring.so 00:07:00.544 CC lib/thread/thread.o 00:07:00.544 CC lib/thread/iobuf.o 00:07:00.544 CC lib/sock/sock.o 00:07:00.544 CC lib/sock/sock_rpc.o 00:07:01.111 LIB libspdk_sock.a 00:07:01.111 SO libspdk_sock.so.10.0 00:07:01.111 SYMLINK libspdk_sock.so 00:07:01.741 CC lib/nvme/nvme_ctrlr_cmd.o 00:07:01.741 CC lib/nvme/nvme_ctrlr.o 00:07:01.741 CC lib/nvme/nvme_fabric.o 00:07:01.741 CC lib/nvme/nvme_ns_cmd.o 00:07:01.741 CC lib/nvme/nvme_ns.o 00:07:01.741 CC lib/nvme/nvme_pcie_common.o 00:07:01.741 CC lib/nvme/nvme_pcie.o 00:07:01.741 CC lib/nvme/nvme_qpair.o 00:07:01.741 CC lib/nvme/nvme.o 00:07:01.741 CC lib/nvme/nvme_quirks.o 00:07:01.741 CC lib/nvme/nvme_transport.o 00:07:01.741 CC lib/nvme/nvme_discovery.o 00:07:01.741 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:07:01.741 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:07:01.741 CC lib/nvme/nvme_tcp.o 00:07:01.741 CC lib/nvme/nvme_opal.o 00:07:01.741 CC lib/nvme/nvme_io_msg.o 00:07:01.741 CC lib/nvme/nvme_poll_group.o 00:07:01.741 CC lib/nvme/nvme_zns.o 00:07:01.741 CC 
lib/nvme/nvme_stubs.o 00:07:01.741 CC lib/nvme/nvme_auth.o 00:07:01.741 CC lib/nvme/nvme_cuse.o 00:07:01.741 CC lib/nvme/nvme_rdma.o 00:07:02.678 LIB libspdk_thread.a 00:07:02.678 SO libspdk_thread.so.10.1 00:07:02.678 SYMLINK libspdk_thread.so 00:07:03.246 CC lib/init/json_config.o 00:07:03.246 CC lib/init/subsystem_rpc.o 00:07:03.246 CC lib/init/subsystem.o 00:07:03.246 CC lib/init/rpc.o 00:07:03.246 CC lib/accel/accel.o 00:07:03.246 CC lib/accel/accel_rpc.o 00:07:03.246 CC lib/accel/accel_sw.o 00:07:03.246 CC lib/blob/blobstore.o 00:07:03.246 CC lib/blob/request.o 00:07:03.246 CC lib/blob/zeroes.o 00:07:03.246 CC lib/blob/blob_bs_dev.o 00:07:03.246 CC lib/virtio/virtio.o 00:07:03.246 CC lib/virtio/virtio_vhost_user.o 00:07:03.246 CC lib/virtio/virtio_vfio_user.o 00:07:03.246 CC lib/virtio/virtio_pci.o 00:07:03.506 LIB libspdk_virtio.a 00:07:03.506 LIB libspdk_init.a 00:07:03.506 SO libspdk_virtio.so.7.0 00:07:03.506 SO libspdk_init.so.5.0 00:07:03.506 SYMLINK libspdk_virtio.so 00:07:03.506 SYMLINK libspdk_init.so 00:07:03.765 LIB libspdk_accel.a 00:07:04.025 SO libspdk_accel.so.16.0 00:07:04.025 CC lib/event/app.o 00:07:04.025 CC lib/event/reactor.o 00:07:04.025 CC lib/event/log_rpc.o 00:07:04.025 CC lib/event/app_rpc.o 00:07:04.025 CC lib/event/scheduler_static.o 00:07:04.025 SYMLINK libspdk_accel.so 00:07:04.025 LIB libspdk_nvme.a 00:07:04.284 SO libspdk_nvme.so.13.1 00:07:04.284 CC lib/bdev/bdev.o 00:07:04.284 CC lib/bdev/bdev_rpc.o 00:07:04.284 CC lib/bdev/bdev_zone.o 00:07:04.284 CC lib/bdev/part.o 00:07:04.284 CC lib/bdev/scsi_nvme.o 00:07:04.544 LIB libspdk_event.a 00:07:04.544 SO libspdk_event.so.14.0 00:07:04.544 SYMLINK libspdk_nvme.so 00:07:04.544 SYMLINK libspdk_event.so 00:07:06.448 LIB libspdk_blob.a 00:07:06.448 SO libspdk_blob.so.11.0 00:07:06.448 SYMLINK libspdk_blob.so 00:07:06.708 CC lib/blobfs/blobfs.o 00:07:06.708 CC lib/blobfs/tree.o 00:07:06.708 CC lib/lvol/lvol.o 00:07:06.967 LIB libspdk_bdev.a 00:07:07.226 SO libspdk_bdev.so.16.0 
00:07:07.226 SYMLINK libspdk_bdev.so 00:07:07.804 LIB libspdk_blobfs.a 00:07:07.804 LIB libspdk_lvol.a 00:07:07.804 CC lib/nvmf/ctrlr_discovery.o 00:07:07.804 CC lib/nvmf/ctrlr.o 00:07:07.804 CC lib/nvmf/ctrlr_bdev.o 00:07:07.804 CC lib/nvmf/subsystem.o 00:07:07.804 CC lib/nvmf/nvmf.o 00:07:07.804 CC lib/nvmf/nvmf_rpc.o 00:07:07.804 CC lib/ublk/ublk.o 00:07:07.804 CC lib/nvmf/transport.o 00:07:07.804 CC lib/ublk/ublk_rpc.o 00:07:07.804 CC lib/nvmf/tcp.o 00:07:07.804 CC lib/nvmf/stubs.o 00:07:07.804 CC lib/nvmf/rdma.o 00:07:07.804 CC lib/nvmf/mdns_server.o 00:07:07.804 CC lib/nvmf/auth.o 00:07:07.804 CC lib/scsi/dev.o 00:07:07.804 CC lib/scsi/lun.o 00:07:07.804 CC lib/scsi/port.o 00:07:07.804 CC lib/scsi/scsi_bdev.o 00:07:07.804 CC lib/scsi/scsi.o 00:07:07.804 CC lib/scsi/scsi_rpc.o 00:07:07.804 CC lib/scsi/scsi_pr.o 00:07:07.804 CC lib/ftl/ftl_core.o 00:07:07.804 CC lib/ftl/ftl_init.o 00:07:07.804 CC lib/scsi/task.o 00:07:07.804 CC lib/ftl/ftl_layout.o 00:07:07.804 CC lib/ftl/ftl_debug.o 00:07:07.804 CC lib/ftl/ftl_io.o 00:07:07.804 CC lib/nbd/nbd.o 00:07:07.804 CC lib/ftl/ftl_sb.o 00:07:07.804 SO libspdk_blobfs.so.10.0 00:07:07.804 CC lib/ftl/ftl_l2p.o 00:07:07.804 CC lib/ftl/ftl_l2p_flat.o 00:07:07.804 CC lib/ftl/ftl_nv_cache.o 00:07:07.804 CC lib/nbd/nbd_rpc.o 00:07:07.804 CC lib/ftl/ftl_band.o 00:07:07.804 SO libspdk_lvol.so.10.0 00:07:07.804 CC lib/ftl/ftl_band_ops.o 00:07:07.804 CC lib/ftl/ftl_writer.o 00:07:07.804 CC lib/ftl/ftl_reloc.o 00:07:07.804 CC lib/ftl/ftl_p2l.o 00:07:07.804 CC lib/ftl/ftl_rq.o 00:07:07.804 CC lib/ftl/ftl_l2p_cache.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_startup.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_md.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_misc.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_band.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:07:07.804 CC 
lib/ftl/mngt/ftl_mngt_p2l.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:07:07.804 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:07:07.804 CC lib/ftl/utils/ftl_conf.o 00:07:07.804 CC lib/ftl/utils/ftl_mempool.o 00:07:07.804 CC lib/ftl/utils/ftl_md.o 00:07:07.804 CC lib/ftl/utils/ftl_bitmap.o 00:07:07.804 CC lib/ftl/utils/ftl_property.o 00:07:07.804 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:07:07.804 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:07:07.804 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:07:07.804 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:07:07.804 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:07:07.804 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:07:07.804 CC lib/ftl/upgrade/ftl_sb_v3.o 00:07:07.804 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:07:07.804 CC lib/ftl/upgrade/ftl_sb_v5.o 00:07:07.804 CC lib/ftl/nvc/ftl_nvc_dev.o 00:07:07.804 SYMLINK libspdk_lvol.so 00:07:07.804 SYMLINK libspdk_blobfs.so 00:07:07.804 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:07:07.804 CC lib/ftl/base/ftl_base_dev.o 00:07:08.063 CC lib/ftl/base/ftl_base_bdev.o 00:07:08.063 CC lib/ftl/ftl_trace.o 00:07:08.322 LIB libspdk_nbd.a 00:07:08.322 SO libspdk_nbd.so.7.0 00:07:08.581 SYMLINK libspdk_nbd.so 00:07:08.581 LIB libspdk_ublk.a 00:07:08.839 SO libspdk_ublk.so.3.0 00:07:08.839 SYMLINK libspdk_ublk.so 00:07:09.098 LIB libspdk_ftl.a 00:07:09.098 LIB libspdk_scsi.a 00:07:09.098 SO libspdk_scsi.so.9.0 00:07:09.357 SO libspdk_ftl.so.9.0 00:07:09.357 SYMLINK libspdk_scsi.so 00:07:09.616 CC lib/iscsi/conn.o 00:07:09.616 CC lib/vhost/vhost.o 00:07:09.616 CC lib/iscsi/init_grp.o 00:07:09.616 CC lib/iscsi/iscsi.o 00:07:09.616 CC lib/vhost/vhost_rpc.o 00:07:09.616 CC lib/iscsi/md5.o 00:07:09.616 CC lib/vhost/vhost_scsi.o 00:07:09.616 CC lib/iscsi/param.o 00:07:09.616 CC lib/vhost/vhost_blk.o 00:07:09.616 CC lib/vhost/rte_vhost_user.o 00:07:09.616 CC lib/iscsi/portal_grp.o 00:07:09.616 CC lib/iscsi/tgt_node.o 00:07:09.616 CC lib/iscsi/iscsi_subsystem.o 00:07:09.616 
CC lib/iscsi/iscsi_rpc.o 00:07:09.616 CC lib/iscsi/task.o 00:07:09.875 SYMLINK libspdk_ftl.so 00:07:10.134 LIB libspdk_nvmf.a 00:07:10.134 SO libspdk_nvmf.so.19.0 00:07:10.393 SYMLINK libspdk_nvmf.so 00:07:10.968 LIB libspdk_vhost.a 00:07:10.968 SO libspdk_vhost.so.8.0 00:07:10.968 SYMLINK libspdk_vhost.so 00:07:11.227 LIB libspdk_iscsi.a 00:07:11.227 SO libspdk_iscsi.so.8.0 00:07:11.488 SYMLINK libspdk_iscsi.so 00:07:12.057 CC module/env_dpdk/env_dpdk_rpc.o 00:07:12.057 CC module/accel/error/accel_error.o 00:07:12.057 CC module/accel/error/accel_error_rpc.o 00:07:12.057 CC module/accel/iaa/accel_iaa.o 00:07:12.057 CC module/accel/iaa/accel_iaa_rpc.o 00:07:12.057 CC module/keyring/file/keyring.o 00:07:12.057 CC module/keyring/file/keyring_rpc.o 00:07:12.057 CC module/keyring/linux/keyring.o 00:07:12.057 CC module/keyring/linux/keyring_rpc.o 00:07:12.057 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:07:12.057 CC module/scheduler/dynamic/scheduler_dynamic.o 00:07:12.057 LIB libspdk_env_dpdk_rpc.a 00:07:12.057 CC module/sock/posix/posix.o 00:07:12.057 CC module/accel/ioat/accel_ioat.o 00:07:12.057 CC module/accel/ioat/accel_ioat_rpc.o 00:07:12.057 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:07:12.057 CC module/blob/bdev/blob_bdev.o 00:07:12.057 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:07:12.057 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:07:12.057 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:07:12.057 CC module/accel/dsa/accel_dsa.o 00:07:12.057 CC module/accel/dsa/accel_dsa_rpc.o 00:07:12.057 CC module/scheduler/gscheduler/gscheduler.o 00:07:12.316 SO libspdk_env_dpdk_rpc.so.6.0 00:07:12.316 SYMLINK libspdk_env_dpdk_rpc.so 00:07:12.316 LIB libspdk_keyring_file.a 00:07:12.316 LIB libspdk_keyring_linux.a 00:07:12.316 LIB libspdk_scheduler_dpdk_governor.a 00:07:12.316 LIB libspdk_accel_ioat.a 00:07:12.316 LIB libspdk_accel_error.a 00:07:12.316 SO libspdk_keyring_file.so.1.0 00:07:12.316 SO 
libspdk_keyring_linux.so.1.0 00:07:12.316 LIB libspdk_scheduler_gscheduler.a 00:07:12.316 LIB libspdk_accel_iaa.a 00:07:12.316 SO libspdk_scheduler_dpdk_governor.so.4.0 00:07:12.316 LIB libspdk_scheduler_dynamic.a 00:07:12.316 SO libspdk_accel_ioat.so.6.0 00:07:12.316 SO libspdk_accel_error.so.2.0 00:07:12.316 SO libspdk_scheduler_gscheduler.so.4.0 00:07:12.576 SO libspdk_accel_iaa.so.3.0 00:07:12.576 SYMLINK libspdk_keyring_file.so 00:07:12.576 SYMLINK libspdk_keyring_linux.so 00:07:12.576 LIB libspdk_accel_dsa.a 00:07:12.576 SO libspdk_scheduler_dynamic.so.4.0 00:07:12.576 SYMLINK libspdk_scheduler_dpdk_governor.so 00:07:12.576 LIB libspdk_blob_bdev.a 00:07:12.576 SYMLINK libspdk_accel_ioat.so 00:07:12.576 SYMLINK libspdk_accel_error.so 00:07:12.576 SO libspdk_accel_dsa.so.5.0 00:07:12.576 SYMLINK libspdk_accel_iaa.so 00:07:12.576 SO libspdk_blob_bdev.so.11.0 00:07:12.576 SYMLINK libspdk_scheduler_gscheduler.so 00:07:12.576 SYMLINK libspdk_scheduler_dynamic.so 00:07:12.576 SYMLINK libspdk_accel_dsa.so 00:07:12.576 SYMLINK libspdk_blob_bdev.so 00:07:12.836 LIB libspdk_sock_posix.a 00:07:12.836 SO libspdk_sock_posix.so.6.0 00:07:12.836 SYMLINK libspdk_sock_posix.so 00:07:13.095 CC module/bdev/error/vbdev_error.o 00:07:13.095 CC module/bdev/error/vbdev_error_rpc.o 00:07:13.095 CC module/bdev/gpt/gpt.o 00:07:13.095 CC module/bdev/gpt/vbdev_gpt.o 00:07:13.095 CC module/bdev/null/bdev_null.o 00:07:13.095 CC module/bdev/null/bdev_null_rpc.o 00:07:13.095 CC module/bdev/raid/bdev_raid.o 00:07:13.095 CC module/bdev/raid/bdev_raid_rpc.o 00:07:13.095 CC module/bdev/raid/bdev_raid_sb.o 00:07:13.095 CC module/bdev/raid/raid0.o 00:07:13.095 CC module/bdev/raid/raid1.o 00:07:13.095 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:07:13.095 CC module/bdev/raid/concat.o 00:07:13.095 CC module/blobfs/bdev/blobfs_bdev.o 00:07:13.095 CC module/bdev/lvol/vbdev_lvol.o 00:07:13.095 CC module/bdev/delay/vbdev_delay.o 00:07:13.095 CC module/bdev/delay/vbdev_delay_rpc.o 00:07:13.095 CC 
module/bdev/lvol/vbdev_lvol_rpc.o 00:07:13.095 CC module/bdev/virtio/bdev_virtio_scsi.o 00:07:13.095 CC module/bdev/aio/bdev_aio.o 00:07:13.095 CC module/bdev/compress/vbdev_compress.o 00:07:13.095 CC module/bdev/compress/vbdev_compress_rpc.o 00:07:13.095 CC module/bdev/virtio/bdev_virtio_blk.o 00:07:13.095 CC module/bdev/split/vbdev_split_rpc.o 00:07:13.095 CC module/bdev/virtio/bdev_virtio_rpc.o 00:07:13.095 CC module/bdev/malloc/bdev_malloc.o 00:07:13.095 CC module/bdev/split/vbdev_split.o 00:07:13.095 CC module/bdev/aio/bdev_aio_rpc.o 00:07:13.096 CC module/bdev/iscsi/bdev_iscsi.o 00:07:13.096 CC module/bdev/nvme/bdev_nvme.o 00:07:13.096 CC module/bdev/malloc/bdev_malloc_rpc.o 00:07:13.096 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:07:13.096 CC module/bdev/nvme/nvme_rpc.o 00:07:13.096 CC module/bdev/nvme/bdev_nvme_rpc.o 00:07:13.096 CC module/bdev/passthru/vbdev_passthru.o 00:07:13.096 CC module/bdev/nvme/bdev_mdns_client.o 00:07:13.096 CC module/bdev/nvme/vbdev_opal.o 00:07:13.096 CC module/bdev/ftl/bdev_ftl.o 00:07:13.096 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:07:13.096 CC module/bdev/nvme/vbdev_opal_rpc.o 00:07:13.096 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:07:13.096 CC module/bdev/ftl/bdev_ftl_rpc.o 00:07:13.096 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:07:13.096 CC module/bdev/crypto/vbdev_crypto.o 00:07:13.096 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:07:13.096 CC module/bdev/zone_block/vbdev_zone_block.o 00:07:13.355 LIB libspdk_accel_dpdk_compressdev.a 00:07:13.355 SO libspdk_accel_dpdk_compressdev.so.3.0 00:07:13.355 SYMLINK libspdk_accel_dpdk_compressdev.so 00:07:13.355 LIB libspdk_bdev_error.a 00:07:13.355 LIB libspdk_bdev_null.a 00:07:13.614 SO libspdk_bdev_error.so.6.0 00:07:13.614 LIB libspdk_blobfs_bdev.a 00:07:13.614 SO libspdk_bdev_null.so.6.0 00:07:13.614 SO libspdk_blobfs_bdev.so.6.0 00:07:13.614 LIB libspdk_bdev_gpt.a 00:07:13.614 SYMLINK libspdk_bdev_error.so 00:07:13.614 LIB libspdk_bdev_compress.a 00:07:13.614 
SYMLINK libspdk_bdev_null.so 00:07:13.614 LIB libspdk_bdev_malloc.a 00:07:13.614 SYMLINK libspdk_blobfs_bdev.so 00:07:13.614 SO libspdk_bdev_gpt.so.6.0 00:07:13.614 SO libspdk_bdev_compress.so.6.0 00:07:13.614 SO libspdk_bdev_malloc.so.6.0 00:07:13.614 LIB libspdk_bdev_split.a 00:07:13.614 LIB libspdk_bdev_virtio.a 00:07:13.614 LIB libspdk_bdev_aio.a 00:07:13.614 SO libspdk_bdev_split.so.6.0 00:07:13.614 LIB libspdk_bdev_ftl.a 00:07:13.614 SYMLINK libspdk_bdev_gpt.so 00:07:13.614 SO libspdk_bdev_aio.so.6.0 00:07:13.614 SO libspdk_bdev_virtio.so.6.0 00:07:13.614 LIB libspdk_accel_dpdk_cryptodev.a 00:07:13.614 LIB libspdk_bdev_delay.a 00:07:13.614 SYMLINK libspdk_bdev_malloc.so 00:07:13.614 SYMLINK libspdk_bdev_compress.so 00:07:13.614 LIB libspdk_bdev_zone_block.a 00:07:13.614 SO libspdk_bdev_ftl.so.6.0 00:07:13.872 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:07:13.872 SO libspdk_bdev_delay.so.6.0 00:07:13.872 LIB libspdk_bdev_iscsi.a 00:07:13.872 SYMLINK libspdk_bdev_split.so 00:07:13.872 SO libspdk_bdev_zone_block.so.6.0 00:07:13.872 SYMLINK libspdk_bdev_aio.so 00:07:13.872 SYMLINK libspdk_bdev_virtio.so 00:07:13.872 SO libspdk_bdev_iscsi.so.6.0 00:07:13.872 SYMLINK libspdk_bdev_ftl.so 00:07:13.872 SYMLINK libspdk_bdev_delay.so 00:07:13.872 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:07:13.872 LIB libspdk_bdev_lvol.a 00:07:13.872 SYMLINK libspdk_bdev_zone_block.so 00:07:13.872 SYMLINK libspdk_bdev_iscsi.so 00:07:13.872 SO libspdk_bdev_lvol.so.6.0 00:07:13.872 LIB libspdk_bdev_passthru.a 00:07:13.872 LIB libspdk_bdev_crypto.a 00:07:14.132 SYMLINK libspdk_bdev_lvol.so 00:07:14.132 SO libspdk_bdev_passthru.so.6.0 00:07:14.132 SO libspdk_bdev_crypto.so.6.0 00:07:14.132 SYMLINK libspdk_bdev_crypto.so 00:07:14.132 SYMLINK libspdk_bdev_passthru.so 00:07:14.132 LIB libspdk_bdev_raid.a 00:07:14.392 SO libspdk_bdev_raid.so.6.0 00:07:14.392 SYMLINK libspdk_bdev_raid.so 00:07:14.961 LIB libspdk_bdev_nvme.a 00:07:14.961 SO libspdk_bdev_nvme.so.7.0 00:07:15.221 SYMLINK 
libspdk_bdev_nvme.so 00:07:16.165 CC module/event/subsystems/sock/sock.o 00:07:16.165 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:07:16.165 CC module/event/subsystems/iobuf/iobuf.o 00:07:16.165 CC module/event/subsystems/keyring/keyring.o 00:07:16.165 CC module/event/subsystems/scheduler/scheduler.o 00:07:16.165 CC module/event/subsystems/vmd/vmd.o 00:07:16.165 CC module/event/subsystems/vmd/vmd_rpc.o 00:07:16.165 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:07:16.165 LIB libspdk_event_sock.a 00:07:16.165 LIB libspdk_event_scheduler.a 00:07:16.165 LIB libspdk_event_vmd.a 00:07:16.165 LIB libspdk_event_vhost_blk.a 00:07:16.165 LIB libspdk_event_iobuf.a 00:07:16.165 SO libspdk_event_sock.so.5.0 00:07:16.165 SO libspdk_event_vmd.so.6.0 00:07:16.165 SO libspdk_event_scheduler.so.4.0 00:07:16.165 SO libspdk_event_vhost_blk.so.3.0 00:07:16.165 SO libspdk_event_iobuf.so.3.0 00:07:16.165 SYMLINK libspdk_event_sock.so 00:07:16.165 SYMLINK libspdk_event_scheduler.so 00:07:16.165 LIB libspdk_event_keyring.a 00:07:16.165 SYMLINK libspdk_event_vhost_blk.so 00:07:16.165 SYMLINK libspdk_event_iobuf.so 00:07:16.165 SYMLINK libspdk_event_vmd.so 00:07:16.440 SO libspdk_event_keyring.so.1.0 00:07:16.440 SYMLINK libspdk_event_keyring.so 00:07:16.712 CC module/event/subsystems/accel/accel.o 00:07:16.712 LIB libspdk_event_accel.a 00:07:16.972 SO libspdk_event_accel.so.6.0 00:07:16.972 SYMLINK libspdk_event_accel.so 00:07:17.231 CC module/event/subsystems/bdev/bdev.o 00:07:17.489 LIB libspdk_event_bdev.a 00:07:17.489 SO libspdk_event_bdev.so.6.0 00:07:17.748 SYMLINK libspdk_event_bdev.so 00:07:18.007 CC module/event/subsystems/nbd/nbd.o 00:07:18.007 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:07:18.007 CC module/event/subsystems/ublk/ublk.o 00:07:18.007 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:07:18.007 CC module/event/subsystems/scsi/scsi.o 00:07:18.267 LIB libspdk_event_nbd.a 00:07:18.267 LIB libspdk_event_scsi.a 00:07:18.267 LIB libspdk_event_ublk.a 00:07:18.267 SO 
libspdk_event_nbd.so.6.0 00:07:18.267 SO libspdk_event_scsi.so.6.0 00:07:18.267 SO libspdk_event_ublk.so.3.0 00:07:18.267 SYMLINK libspdk_event_nbd.so 00:07:18.267 SYMLINK libspdk_event_scsi.so 00:07:18.267 SYMLINK libspdk_event_ublk.so 00:07:18.526 LIB libspdk_event_nvmf.a 00:07:18.526 SO libspdk_event_nvmf.so.6.0 00:07:18.785 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:07:18.785 CC module/event/subsystems/iscsi/iscsi.o 00:07:18.785 SYMLINK libspdk_event_nvmf.so 00:07:18.785 LIB libspdk_event_vhost_scsi.a 00:07:18.785 LIB libspdk_event_iscsi.a 00:07:18.785 SO libspdk_event_vhost_scsi.so.3.0 00:07:19.045 SO libspdk_event_iscsi.so.6.0 00:07:19.045 SYMLINK libspdk_event_iscsi.so 00:07:19.045 SYMLINK libspdk_event_vhost_scsi.so 00:07:19.305 SO libspdk.so.6.0 00:07:19.305 SYMLINK libspdk.so 00:07:19.565 CC app/spdk_top/spdk_top.o 00:07:19.565 CC app/trace_record/trace_record.o 00:07:19.565 CXX app/trace/trace.o 00:07:19.565 CC test/rpc_client/rpc_client_test.o 00:07:19.565 CC app/spdk_nvme_discover/discovery_aer.o 00:07:19.565 CC app/spdk_nvme_identify/identify.o 00:07:19.565 CC app/spdk_lspci/spdk_lspci.o 00:07:19.565 TEST_HEADER include/spdk/accel.h 00:07:19.565 TEST_HEADER include/spdk/accel_module.h 00:07:19.565 TEST_HEADER include/spdk/assert.h 00:07:19.565 TEST_HEADER include/spdk/barrier.h 00:07:19.565 TEST_HEADER include/spdk/base64.h 00:07:19.565 TEST_HEADER include/spdk/bdev_module.h 00:07:19.565 TEST_HEADER include/spdk/bdev.h 00:07:19.565 TEST_HEADER include/spdk/bdev_zone.h 00:07:19.565 TEST_HEADER include/spdk/bit_array.h 00:07:19.565 TEST_HEADER include/spdk/blob_bdev.h 00:07:19.565 TEST_HEADER include/spdk/bit_pool.h 00:07:19.565 TEST_HEADER include/spdk/blobfs_bdev.h 00:07:19.565 CC app/spdk_nvme_perf/perf.o 00:07:19.565 TEST_HEADER include/spdk/blobfs.h 00:07:19.565 TEST_HEADER include/spdk/blob.h 00:07:19.565 CC examples/interrupt_tgt/interrupt_tgt.o 00:07:19.565 TEST_HEADER include/spdk/config.h 00:07:19.565 TEST_HEADER include/spdk/conf.h 
00:07:19.565 TEST_HEADER include/spdk/crc16.h 00:07:19.565 TEST_HEADER include/spdk/cpuset.h 00:07:19.565 TEST_HEADER include/spdk/dif.h 00:07:19.565 TEST_HEADER include/spdk/crc64.h 00:07:19.565 TEST_HEADER include/spdk/dma.h 00:07:19.565 TEST_HEADER include/spdk/crc32.h 00:07:19.565 TEST_HEADER include/spdk/endian.h 00:07:19.565 TEST_HEADER include/spdk/env.h 00:07:19.565 TEST_HEADER include/spdk/event.h 00:07:19.565 TEST_HEADER include/spdk/env_dpdk.h 00:07:19.565 TEST_HEADER include/spdk/fd_group.h 00:07:19.565 TEST_HEADER include/spdk/fd.h 00:07:19.565 TEST_HEADER include/spdk/file.h 00:07:19.565 TEST_HEADER include/spdk/ftl.h 00:07:19.565 TEST_HEADER include/spdk/gpt_spec.h 00:07:19.565 TEST_HEADER include/spdk/histogram_data.h 00:07:19.565 TEST_HEADER include/spdk/idxd.h 00:07:19.565 TEST_HEADER include/spdk/hexlify.h 00:07:19.565 TEST_HEADER include/spdk/idxd_spec.h 00:07:19.565 TEST_HEADER include/spdk/init.h 00:07:19.565 TEST_HEADER include/spdk/ioat.h 00:07:19.565 TEST_HEADER include/spdk/ioat_spec.h 00:07:19.565 TEST_HEADER include/spdk/json.h 00:07:19.565 TEST_HEADER include/spdk/jsonrpc.h 00:07:19.565 TEST_HEADER include/spdk/iscsi_spec.h 00:07:19.565 TEST_HEADER include/spdk/keyring_module.h 00:07:19.565 TEST_HEADER include/spdk/keyring.h 00:07:19.565 TEST_HEADER include/spdk/likely.h 00:07:19.565 TEST_HEADER include/spdk/lvol.h 00:07:19.565 TEST_HEADER include/spdk/log.h 00:07:19.565 TEST_HEADER include/spdk/mmio.h 00:07:19.565 TEST_HEADER include/spdk/nbd.h 00:07:19.565 CC app/spdk_dd/spdk_dd.o 00:07:19.565 TEST_HEADER include/spdk/memory.h 00:07:19.565 TEST_HEADER include/spdk/net.h 00:07:19.565 CC app/iscsi_tgt/iscsi_tgt.o 00:07:19.565 TEST_HEADER include/spdk/nvme_intel.h 00:07:19.565 TEST_HEADER include/spdk/nvme.h 00:07:19.565 TEST_HEADER include/spdk/notify.h 00:07:19.565 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:07:19.565 TEST_HEADER include/spdk/nvme_spec.h 00:07:19.565 TEST_HEADER include/spdk/nvme_zns.h 00:07:19.565 TEST_HEADER 
include/spdk/nvmf_cmd.h 00:07:19.565 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:07:19.565 TEST_HEADER include/spdk/nvme_ocssd.h 00:07:19.565 TEST_HEADER include/spdk/nvmf.h 00:07:19.565 TEST_HEADER include/spdk/nvmf_transport.h 00:07:19.565 CC app/spdk_tgt/spdk_tgt.o 00:07:19.565 TEST_HEADER include/spdk/opal_spec.h 00:07:19.565 TEST_HEADER include/spdk/nvmf_spec.h 00:07:19.565 TEST_HEADER include/spdk/pci_ids.h 00:07:19.565 TEST_HEADER include/spdk/pipe.h 00:07:19.831 TEST_HEADER include/spdk/queue.h 00:07:19.831 TEST_HEADER include/spdk/reduce.h 00:07:19.831 TEST_HEADER include/spdk/opal.h 00:07:19.831 CC app/nvmf_tgt/nvmf_main.o 00:07:19.831 TEST_HEADER include/spdk/rpc.h 00:07:19.831 TEST_HEADER include/spdk/scheduler.h 00:07:19.831 TEST_HEADER include/spdk/scsi.h 00:07:19.831 TEST_HEADER include/spdk/scsi_spec.h 00:07:19.831 TEST_HEADER include/spdk/sock.h 00:07:19.831 TEST_HEADER include/spdk/stdinc.h 00:07:19.831 TEST_HEADER include/spdk/string.h 00:07:19.831 TEST_HEADER include/spdk/thread.h 00:07:19.831 TEST_HEADER include/spdk/trace.h 00:07:19.831 TEST_HEADER include/spdk/trace_parser.h 00:07:19.831 TEST_HEADER include/spdk/tree.h 00:07:19.831 TEST_HEADER include/spdk/ublk.h 00:07:19.831 TEST_HEADER include/spdk/util.h 00:07:19.831 TEST_HEADER include/spdk/uuid.h 00:07:19.831 TEST_HEADER include/spdk/version.h 00:07:19.831 TEST_HEADER include/spdk/vfio_user_pci.h 00:07:19.831 TEST_HEADER include/spdk/vfio_user_spec.h 00:07:19.831 TEST_HEADER include/spdk/vhost.h 00:07:19.831 TEST_HEADER include/spdk/vmd.h 00:07:19.831 TEST_HEADER include/spdk/xor.h 00:07:19.831 TEST_HEADER include/spdk/zipf.h 00:07:19.831 CXX test/cpp_headers/accel.o 00:07:19.831 CXX test/cpp_headers/accel_module.o 00:07:19.831 CXX test/cpp_headers/assert.o 00:07:19.831 CXX test/cpp_headers/base64.o 00:07:19.831 CXX test/cpp_headers/barrier.o 00:07:19.831 CXX test/cpp_headers/bdev.o 00:07:19.831 CXX test/cpp_headers/bdev_zone.o 00:07:19.831 CXX test/cpp_headers/bit_array.o 00:07:19.831 
CXX test/cpp_headers/bit_pool.o 00:07:19.831 CXX test/cpp_headers/blob_bdev.o 00:07:19.831 CXX test/cpp_headers/bdev_module.o 00:07:19.831 CXX test/cpp_headers/blobfs_bdev.o 00:07:19.831 CXX test/cpp_headers/blobfs.o 00:07:19.831 CXX test/cpp_headers/conf.o 00:07:19.831 CXX test/cpp_headers/blob.o 00:07:19.831 CXX test/cpp_headers/config.o 00:07:19.831 CXX test/cpp_headers/crc16.o 00:07:19.831 CXX test/cpp_headers/crc32.o 00:07:19.831 CXX test/cpp_headers/cpuset.o 00:07:19.831 CXX test/cpp_headers/crc64.o 00:07:19.831 CXX test/cpp_headers/dma.o 00:07:19.831 CXX test/cpp_headers/dif.o 00:07:19.831 CXX test/cpp_headers/endian.o 00:07:19.831 CXX test/cpp_headers/env.o 00:07:19.831 CXX test/cpp_headers/event.o 00:07:19.831 CC test/env/vtophys/vtophys.o 00:07:19.831 CXX test/cpp_headers/env_dpdk.o 00:07:19.831 CXX test/cpp_headers/fd_group.o 00:07:19.831 CXX test/cpp_headers/file.o 00:07:19.831 CXX test/cpp_headers/fd.o 00:07:19.831 CXX test/cpp_headers/ftl.o 00:07:19.831 CXX test/cpp_headers/gpt_spec.o 00:07:19.831 CXX test/cpp_headers/histogram_data.o 00:07:19.831 CC examples/util/zipf/zipf.o 00:07:19.831 CXX test/cpp_headers/init.o 00:07:19.831 CXX test/cpp_headers/hexlify.o 00:07:19.831 CXX test/cpp_headers/ioat.o 00:07:19.831 CXX test/cpp_headers/ioat_spec.o 00:07:19.831 CXX test/cpp_headers/idxd.o 00:07:19.831 CXX test/cpp_headers/idxd_spec.o 00:07:19.831 CXX test/cpp_headers/iscsi_spec.o 00:07:19.831 CC examples/ioat/perf/perf.o 00:07:19.831 CXX test/cpp_headers/json.o 00:07:19.831 CXX test/cpp_headers/jsonrpc.o 00:07:19.831 CC examples/ioat/verify/verify.o 00:07:19.831 CXX test/cpp_headers/keyring.o 00:07:19.831 CC test/thread/poller_perf/poller_perf.o 00:07:19.831 CC test/env/memory/memory_ut.o 00:07:19.831 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:07:19.831 CC test/env/pci/pci_ut.o 00:07:19.831 CXX test/cpp_headers/keyring_module.o 00:07:19.831 CC test/app/jsoncat/jsoncat.o 00:07:19.831 CC test/app/stub/stub.o 00:07:19.831 CC 
app/fio/nvme/fio_plugin.o 00:07:19.831 CC test/app/histogram_perf/histogram_perf.o 00:07:19.831 CC test/dma/test_dma/test_dma.o 00:07:19.831 CC app/fio/bdev/fio_plugin.o 00:07:19.831 CC test/app/bdev_svc/bdev_svc.o 00:07:20.100 LINK spdk_lspci 00:07:20.100 LINK spdk_nvme_discover 00:07:20.100 LINK rpc_client_test 00:07:20.100 LINK zipf 00:07:20.100 LINK spdk_trace_record 00:07:20.100 CC test/env/mem_callbacks/mem_callbacks.o 00:07:20.100 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:07:20.366 LINK vtophys 00:07:20.366 LINK poller_perf 00:07:20.366 LINK iscsi_tgt 00:07:20.366 LINK spdk_tgt 00:07:20.366 LINK jsoncat 00:07:20.366 LINK env_dpdk_post_init 00:07:20.366 LINK nvmf_tgt 00:07:20.366 CXX test/cpp_headers/likely.o 00:07:20.366 LINK interrupt_tgt 00:07:20.366 CXX test/cpp_headers/log.o 00:07:20.366 CXX test/cpp_headers/lvol.o 00:07:20.366 CXX test/cpp_headers/memory.o 00:07:20.366 CXX test/cpp_headers/mmio.o 00:07:20.366 CXX test/cpp_headers/nbd.o 00:07:20.366 CXX test/cpp_headers/net.o 00:07:20.366 CXX test/cpp_headers/notify.o 00:07:20.366 CXX test/cpp_headers/nvme.o 00:07:20.366 CXX test/cpp_headers/nvme_intel.o 00:07:20.366 CXX test/cpp_headers/nvme_ocssd.o 00:07:20.366 CXX test/cpp_headers/nvme_ocssd_spec.o 00:07:20.366 LINK stub 00:07:20.366 LINK histogram_perf 00:07:20.366 CXX test/cpp_headers/nvme_spec.o 00:07:20.366 CXX test/cpp_headers/nvme_zns.o 00:07:20.366 CXX test/cpp_headers/nvmf_cmd.o 00:07:20.366 CXX test/cpp_headers/nvmf_fc_spec.o 00:07:20.366 LINK bdev_svc 00:07:20.366 CXX test/cpp_headers/nvmf.o 00:07:20.366 CXX test/cpp_headers/nvmf_spec.o 00:07:20.366 CXX test/cpp_headers/nvmf_transport.o 00:07:20.366 CXX test/cpp_headers/opal.o 00:07:20.366 CXX test/cpp_headers/pci_ids.o 00:07:20.366 CXX test/cpp_headers/opal_spec.o 00:07:20.366 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:07:20.366 CXX test/cpp_headers/pipe.o 00:07:20.366 CXX test/cpp_headers/queue.o 00:07:20.366 CXX test/cpp_headers/reduce.o 00:07:20.366 LINK ioat_perf 00:07:20.366 CXX 
test/cpp_headers/rpc.o 00:07:20.366 CXX test/cpp_headers/scheduler.o 00:07:20.366 CXX test/cpp_headers/scsi.o 00:07:20.366 CXX test/cpp_headers/scsi_spec.o 00:07:20.366 CXX test/cpp_headers/sock.o 00:07:20.366 CXX test/cpp_headers/stdinc.o 00:07:20.366 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:07:20.366 CXX test/cpp_headers/string.o 00:07:20.366 CXX test/cpp_headers/thread.o 00:07:20.366 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:07:20.366 CXX test/cpp_headers/trace_parser.o 00:07:20.366 CXX test/cpp_headers/trace.o 00:07:20.366 CXX test/cpp_headers/tree.o 00:07:20.626 CXX test/cpp_headers/util.o 00:07:20.626 CXX test/cpp_headers/ublk.o 00:07:20.626 CXX test/cpp_headers/uuid.o 00:07:20.626 LINK mem_callbacks 00:07:20.626 CXX test/cpp_headers/version.o 00:07:20.626 LINK spdk_trace 00:07:20.626 LINK spdk_dd 00:07:20.626 CXX test/cpp_headers/vfio_user_pci.o 00:07:20.626 LINK pci_ut 00:07:20.626 CXX test/cpp_headers/vfio_user_spec.o 00:07:20.626 CXX test/cpp_headers/vhost.o 00:07:20.626 CXX test/cpp_headers/vmd.o 00:07:20.626 CXX test/cpp_headers/xor.o 00:07:20.626 CXX test/cpp_headers/zipf.o 00:07:20.888 LINK verify 00:07:20.888 CC examples/sock/hello_world/hello_sock.o 00:07:20.888 CC examples/vmd/lsvmd/lsvmd.o 00:07:20.888 CC examples/vmd/led/led.o 00:07:20.888 CC test/event/event_perf/event_perf.o 00:07:20.888 CC test/event/reactor/reactor.o 00:07:20.888 CC examples/thread/thread/thread_ex.o 00:07:20.888 CC test/event/reactor_perf/reactor_perf.o 00:07:21.147 CC test/event/app_repeat/app_repeat.o 00:07:21.147 CC examples/idxd/perf/perf.o 00:07:21.147 LINK spdk_bdev 00:07:21.147 LINK nvme_fuzz 00:07:21.147 LINK spdk_nvme 00:07:21.147 CC test/event/scheduler/scheduler.o 00:07:21.147 LINK vhost_fuzz 00:07:21.147 LINK memory_ut 00:07:21.147 LINK event_perf 00:07:21.147 LINK lsvmd 00:07:21.147 LINK led 00:07:21.147 LINK reactor 00:07:21.147 CC app/vhost/vhost.o 00:07:21.147 LINK reactor_perf 00:07:21.147 LINK spdk_nvme_perf 00:07:21.147 LINK app_repeat 
00:07:21.147 LINK spdk_top 00:07:21.147 LINK hello_sock 00:07:21.406 LINK thread 00:07:21.406 LINK test_dma 00:07:21.406 LINK scheduler 00:07:21.406 LINK idxd_perf 00:07:21.406 LINK vhost 00:07:21.974 CC test/nvme/err_injection/err_injection.o 00:07:21.974 CC test/nvme/aer/aer.o 00:07:21.974 CC test/nvme/e2edp/nvme_dp.o 00:07:21.974 CC test/nvme/boot_partition/boot_partition.o 00:07:21.974 CC test/nvme/overhead/overhead.o 00:07:21.974 CC test/nvme/connect_stress/connect_stress.o 00:07:21.974 CC test/nvme/compliance/nvme_compliance.o 00:07:21.974 CC test/nvme/startup/startup.o 00:07:21.974 CC test/nvme/fused_ordering/fused_ordering.o 00:07:21.974 CC test/nvme/simple_copy/simple_copy.o 00:07:21.974 CC test/nvme/reset/reset.o 00:07:21.974 CC test/nvme/cuse/cuse.o 00:07:21.974 CC test/nvme/fdp/fdp.o 00:07:21.974 CC test/nvme/reserve/reserve.o 00:07:21.974 CC test/nvme/sgl/sgl.o 00:07:21.974 CC test/nvme/doorbell_aers/doorbell_aers.o 00:07:21.974 CC examples/accel/perf/accel_perf.o 00:07:21.974 CC test/accel/dif/dif.o 00:07:21.974 CC test/blobfs/mkfs/mkfs.o 00:07:21.974 CC examples/blob/hello_world/hello_blob.o 00:07:21.974 CC examples/blob/cli/blobcli.o 00:07:21.974 CC examples/nvme/hotplug/hotplug.o 00:07:21.974 CC examples/nvme/nvme_manage/nvme_manage.o 00:07:21.974 CC examples/nvme/hello_world/hello_world.o 00:07:21.974 CC test/lvol/esnap/esnap.o 00:07:21.974 CC examples/nvme/reconnect/reconnect.o 00:07:21.974 CC examples/nvme/arbitration/arbitration.o 00:07:21.974 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:07:21.974 CC examples/nvme/cmb_copy/cmb_copy.o 00:07:21.974 CC examples/nvme/abort/abort.o 00:07:22.234 LINK startup 00:07:22.234 LINK err_injection 00:07:22.234 LINK spdk_nvme_identify 00:07:22.234 LINK connect_stress 00:07:22.234 LINK boot_partition 00:07:22.234 LINK fused_ordering 00:07:22.234 LINK doorbell_aers 00:07:22.234 LINK reset 00:07:22.234 LINK simple_copy 00:07:22.234 LINK reserve 00:07:22.234 LINK aer 00:07:22.234 LINK nvme_dp 
00:07:22.234 LINK sgl 00:07:22.234 LINK hello_blob 00:07:22.234 LINK mkfs 00:07:22.234 LINK overhead 00:07:22.234 LINK pmr_persistence 00:07:22.234 LINK nvme_compliance 00:07:22.234 LINK hello_world 00:07:22.234 LINK fdp 00:07:22.234 LINK hotplug 00:07:22.494 LINK abort 00:07:22.494 LINK reconnect 00:07:22.494 LINK arbitration 00:07:22.494 LINK accel_perf 00:07:22.494 LINK dif 00:07:22.494 LINK cmb_copy 00:07:22.494 LINK blobcli 00:07:22.494 LINK nvme_manage 00:07:22.753 LINK iscsi_fuzz 00:07:23.012 CC examples/bdev/hello_world/hello_bdev.o 00:07:23.012 CC examples/bdev/bdevperf/bdevperf.o 00:07:23.272 CC test/bdev/bdevio/bdevio.o 00:07:23.272 LINK cuse 00:07:23.531 LINK hello_bdev 00:07:23.531 LINK bdevio 00:07:23.790 LINK bdevperf 00:07:24.727 CC examples/nvmf/nvmf/nvmf.o 00:07:24.986 LINK nvmf 00:07:27.524 LINK esnap 00:07:27.524 00:07:27.524 real 0m49.566s 00:07:27.524 user 15m27.038s 00:07:27.524 sys 3m5.132s 00:07:27.524 17:01:22 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:07:27.524 17:01:22 make -- common/autotest_common.sh@10 -- $ set +x 00:07:27.524 ************************************ 00:07:27.525 END TEST make 00:07:27.525 ************************************ 00:07:27.525 17:01:22 -- common/autotest_common.sh@1142 -- $ return 0 00:07:27.525 17:01:22 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:07:27.525 17:01:22 -- pm/common@29 -- $ signal_monitor_resources TERM 00:07:27.525 17:01:22 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:07:27.525 17:01:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.525 17:01:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:07:27.525 17:01:22 -- pm/common@44 -- $ pid=3895234 00:07:27.525 17:01:22 -- pm/common@50 -- $ kill -TERM 3895234 00:07:27.525 17:01:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.525 17:01:22 -- pm/common@43 -- $ [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:07:27.525 17:01:22 -- pm/common@44 -- $ pid=3895236 00:07:27.525 17:01:22 -- pm/common@50 -- $ kill -TERM 3895236 00:07:27.525 17:01:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.525 17:01:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:07:27.525 17:01:22 -- pm/common@44 -- $ pid=3895237 00:07:27.525 17:01:22 -- pm/common@50 -- $ kill -TERM 3895237 00:07:27.525 17:01:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.525 17:01:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:07:27.525 17:01:22 -- pm/common@44 -- $ pid=3895260 00:07:27.525 17:01:22 -- pm/common@50 -- $ sudo -E kill -TERM 3895260 00:07:27.785 17:01:23 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:27.785 17:01:23 -- nvmf/common.sh@7 -- # uname -s 00:07:27.785 17:01:23 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:27.785 17:01:23 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:27.785 17:01:23 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:27.785 17:01:23 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:27.785 17:01:23 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:27.785 17:01:23 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:27.785 17:01:23 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:27.785 17:01:23 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:27.785 17:01:23 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:27.785 17:01:23 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:27.785 17:01:23 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:27.785 17:01:23 -- nvmf/common.sh@18 -- # 
NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:27.785 17:01:23 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:27.785 17:01:23 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:27.785 17:01:23 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:27.785 17:01:23 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:27.785 17:01:23 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:27.785 17:01:23 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:27.785 17:01:23 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:27.785 17:01:23 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:27.785 17:01:23 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.785 17:01:23 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.785 17:01:23 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.785 17:01:23 -- paths/export.sh@5 -- # export PATH 00:07:27.785 17:01:23 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:27.785 17:01:23 -- nvmf/common.sh@47 -- # : 0 00:07:27.785 17:01:23 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:27.785 17:01:23 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:27.785 17:01:23 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:27.785 17:01:23 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:27.785 17:01:23 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:27.785 17:01:23 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:27.785 17:01:23 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:27.785 17:01:23 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:27.785 17:01:23 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:07:27.785 17:01:23 -- spdk/autotest.sh@32 -- # uname -s 00:07:27.785 17:01:23 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:07:27.785 17:01:23 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:07:27.785 17:01:23 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:07:27.785 17:01:23 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:07:27.785 17:01:23 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:07:27.785 17:01:23 -- spdk/autotest.sh@44 -- # modprobe nbd 00:07:27.785 17:01:23 -- spdk/autotest.sh@46 -- # type -P udevadm 00:07:27.785 17:01:23 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:07:27.785 17:01:23 -- spdk/autotest.sh@48 -- # udevadm_pid=4017445 00:07:27.785 17:01:23 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:07:27.785 17:01:23 -- spdk/autotest.sh@47 -- 
# /usr/sbin/udevadm monitor --property 00:07:27.785 17:01:23 -- pm/common@17 -- # local monitor 00:07:27.785 17:01:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.785 17:01:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.785 17:01:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.785 17:01:23 -- pm/common@21 -- # date +%s 00:07:27.785 17:01:23 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:07:27.785 17:01:23 -- pm/common@21 -- # date +%s 00:07:27.785 17:01:23 -- pm/common@25 -- # sleep 1 00:07:27.785 17:01:23 -- pm/common@21 -- # date +%s 00:07:27.785 17:01:23 -- pm/common@21 -- # date +%s 00:07:27.785 17:01:23 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721746883 00:07:27.785 17:01:23 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721746883 00:07:27.785 17:01:23 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721746883 00:07:27.785 17:01:23 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721746883 00:07:27.785 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721746883_collect-vmstat.pm.log 00:07:27.785 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721746883_collect-cpu-load.pm.log 00:07:27.785 Redirecting to 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721746883_collect-cpu-temp.pm.log 00:07:27.785 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721746883_collect-bmc-pm.bmc.pm.log 00:07:28.728 17:01:24 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:07:28.728 17:01:24 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:07:28.728 17:01:24 -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:28.728 17:01:24 -- common/autotest_common.sh@10 -- # set +x 00:07:28.728 17:01:24 -- spdk/autotest.sh@59 -- # create_test_list 00:07:28.728 17:01:24 -- common/autotest_common.sh@746 -- # xtrace_disable 00:07:28.728 17:01:24 -- common/autotest_common.sh@10 -- # set +x 00:07:28.988 17:01:24 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:07:28.988 17:01:24 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:28.988 17:01:24 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:28.988 17:01:24 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:28.988 17:01:24 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:28.988 17:01:24 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:07:28.988 17:01:24 -- common/autotest_common.sh@1455 -- # uname 00:07:28.988 17:01:24 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:07:28.988 17:01:24 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:07:28.988 17:01:24 -- common/autotest_common.sh@1475 -- # uname 00:07:28.988 17:01:24 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:07:28.988 17:01:24 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:07:28.988 17:01:24 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:07:28.988 17:01:24 -- spdk/autotest.sh@72 -- # hash lcov 
00:07:28.988 17:01:24 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:07:28.988 17:01:24 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:07:28.988 --rc lcov_branch_coverage=1 00:07:28.988 --rc lcov_function_coverage=1 00:07:28.988 --rc genhtml_branch_coverage=1 00:07:28.988 --rc genhtml_function_coverage=1 00:07:28.988 --rc genhtml_legend=1 00:07:28.988 --rc geninfo_all_blocks=1 00:07:28.988 ' 00:07:28.988 17:01:24 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:07:28.988 --rc lcov_branch_coverage=1 00:07:28.988 --rc lcov_function_coverage=1 00:07:28.988 --rc genhtml_branch_coverage=1 00:07:28.988 --rc genhtml_function_coverage=1 00:07:28.988 --rc genhtml_legend=1 00:07:28.988 --rc geninfo_all_blocks=1 00:07:28.988 ' 00:07:28.988 17:01:24 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:07:28.988 --rc lcov_branch_coverage=1 00:07:28.988 --rc lcov_function_coverage=1 00:07:28.988 --rc genhtml_branch_coverage=1 00:07:28.988 --rc genhtml_function_coverage=1 00:07:28.988 --rc genhtml_legend=1 00:07:28.988 --rc geninfo_all_blocks=1 00:07:28.988 --no-external' 00:07:28.988 17:01:24 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:07:28.988 --rc lcov_branch_coverage=1 00:07:28.988 --rc lcov_function_coverage=1 00:07:28.988 --rc genhtml_branch_coverage=1 00:07:28.988 --rc genhtml_function_coverage=1 00:07:28.988 --rc genhtml_legend=1 00:07:28.988 --rc geninfo_all_blocks=1 00:07:28.988 --no-external' 00:07:28.988 17:01:24 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:07:28.988 lcov: LCOV version 1.14 00:07:28.988 17:01:24 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:07:47.082 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:07:47.082 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 
00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:07:59.358 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:07:59.358 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:07:59.358 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 
00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 
00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:07:59.617 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno
00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found
00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno
00:07:59.617 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found
00:07:59.617 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found
00:07:59.876 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno
00:07:59.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found
00:07:59.877 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno
00:07:59.877 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found
00:07:59.877 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno
00:07:59.877 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found
00:07:59.877 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno
00:07:59.877 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found
00:07:59.877 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno
00:07:59.877 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found
00:07:59.877 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno
00:07:59.877 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found
00:07:59.877 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found
00:08:00.136 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno
00:08:00.136 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno
00:08:00.137 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found
00:08:00.137 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno
00:08:04.332 17:01:59 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup
00:08:04.333 17:01:59 -- common/autotest_common.sh@722 -- # xtrace_disable
00:08:04.333 17:01:59 -- common/autotest_common.sh@10 -- # set +x
00:08:04.333 17:01:59 -- spdk/autotest.sh@91 -- # rm -f
00:08:04.333 17:01:59 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:08:07.625 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:08:07.625 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:08:07.625 0000:5e:00.0 (8086 0b60): Already using the nvme driver
00:08:07.625 0000:00:04.7 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.6 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.5 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.4 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.3 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.2 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.1 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:00:04.0 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:80:04.7 (8086 2021): Already using the ioatdma driver
00:08:07.884 0000:80:04.6 (8086 2021): Already using the ioatdma driver
00:08:08.142 0000:80:04.5 (8086 2021): Already using the ioatdma driver
00:08:08.142 0000:80:04.4 (8086 2021): Already using the ioatdma driver
00:08:08.142 0000:80:04.3 (8086 2021): Already using the ioatdma driver
00:08:08.142 0000:80:04.2 (8086 2021): Already using the ioatdma driver
00:08:08.142 0000:80:04.1 (8086 2021): Already using the ioatdma driver
00:08:08.142 0000:80:04.0 (8086 2021): Already using the ioatdma driver
00:08:08.142 17:02:03 -- spdk/autotest.sh@96 -- # get_zoned_devs
00:08:08.142 17:02:03 -- common/autotest_common.sh@1669 -- # zoned_devs=()
00:08:08.142 17:02:03 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs
00:08:08.142 17:02:03 -- common/autotest_common.sh@1670 -- # local nvme bdf
00:08:08.142 17:02:03 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme*
00:08:08.142 17:02:03 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1
00:08:08.142 17:02:03 -- common/autotest_common.sh@1662 -- # local device=nvme0n1
00:08:08.142 17:02:03 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:08:08.142 17:02:03 -- common/autotest_common.sh@1665 -- # [[ none != none ]]
00:08:08.142 17:02:03 -- spdk/autotest.sh@98 -- # (( 0 > 0 ))
00:08:08.142 17:02:03 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*)
00:08:08.142 17:02:03 -- spdk/autotest.sh@112 -- # [[ -z '' ]]
00:08:08.142 17:02:03 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1
00:08:08.142 17:02:03 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt
00:08:08.142 17:02:03 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1
00:08:08.142 No valid GPT data, bailing
00:08:08.142 17:02:03 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:08:08.142 17:02:03 -- scripts/common.sh@391 -- # pt=
00:08:08.142 17:02:03 -- scripts/common.sh@392 -- # return 1
00:08:08.142 17:02:03 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
00:08:08.142 1+0 records in
00:08:08.142 1+0 records out
00:08:08.142 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0051587 s, 203 MB/s
00:08:08.142 17:02:03 -- spdk/autotest.sh@118 -- # sync
00:08:08.401 17:02:03 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes
00:08:08.401 17:02:03 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null'
00:08:08.401 17:02:03 -- common/autotest_common.sh@22 -- # reap_spdk_processes
00:08:13.674 17:02:08 -- spdk/autotest.sh@124 -- # uname -s
00:08:13.674 17:02:08 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']'
00:08:13.674 17:02:08 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:08:13.674 17:02:08 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:13.674 17:02:08 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:13.675 17:02:08 -- common/autotest_common.sh@10 -- # set +x
00:08:13.675 ************************************
00:08:13.675 START TEST setup.sh
00:08:13.675 ************************************
00:08:13.675 17:02:08 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:08:13.675 * Looking for test storage...
00:08:13.675 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:08:13.675 17:02:08 setup.sh -- setup/test-setup.sh@10 -- # uname -s
00:08:13.675 17:02:08 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]]
00:08:13.675 17:02:08 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:08:13.675 17:02:08 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:13.675 17:02:08 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:13.675 17:02:08 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:08:13.675 ************************************
00:08:13.675 START TEST acl
00:08:13.675 ************************************
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:08:13.675 * Looking for test storage...
00:08:13.675 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:08:13.675 17:02:08 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=()
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme*
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:08:13.675 17:02:08 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]]
00:08:13.675 17:02:08 setup.sh.acl -- setup/acl.sh@12 -- # devs=()
00:08:13.675 17:02:08 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs
00:08:13.675 17:02:08 setup.sh.acl -- setup/acl.sh@13 -- # drivers=()
00:08:13.675 17:02:08 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers
00:08:13.675 17:02:08 setup.sh.acl -- setup/acl.sh@51 -- # setup reset
00:08:13.675 17:02:08 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]]
00:08:13.675 17:02:08 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:08:17.872 17:02:13 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs
00:08:17.872 17:02:13 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver
00:08:17.872 17:02:13 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:17.872 17:02:13 setup.sh.acl -- setup/acl.sh@15 -- # setup output status
00:08:17.872 17:02:13 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]]
00:08:17.872 17:02:13 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:08:21.164 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]]
00:08:21.164 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:08:21.164 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.164 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]]
00:08:21.164 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:08:21.164 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.164 Hugepages
00:08:21.164 node hugesize free / total
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424
00:08:21.424 Type BDF Vendor Device NUMA Driver Device Block devices
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev")
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]]
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:08:21.424 17:02:16 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:08:21.684 17:02:16 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 ))
00:08:21.684 17:02:16 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied
00:08:21.684 17:02:16 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:21.684 17:02:16 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:21.684 17:02:16 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:08:21.684 ************************************
00:08:21.684 START TEST denied
00:08:21.684 ************************************
00:08:21.684 17:02:16 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied
00:08:21.684 17:02:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0'
00:08:21.684 17:02:16 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config
00:08:21.684 17:02:16 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0'
00:08:21.684 17:02:16 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]]
00:08:21.684 17:02:16 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:08:25.883 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@"
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]]
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]]
00:08:25.883 17:02:20 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:08:31.226
00:08:31.226 real 0m9.210s
00:08:31.226 user 0m2.961s
00:08:31.226 sys 0m5.561s
00:08:31.226 17:02:26 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:31.226 17:02:26 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x
00:08:31.226 ************************************
00:08:31.226 END TEST denied
00:08:31.226 ************************************
00:08:31.226 17:02:26 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0
00:08:31.226 17:02:26 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed
00:08:31.226 17:02:26 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:31.226 17:02:26 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:31.226 17:02:26 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:08:31.226 ************************************
00:08:31.226 START TEST allowed
00:08:31.226 ************************************
00:08:31.226 17:02:26 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed
00:08:31.226 17:02:26 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0
00:08:31.226 17:02:26 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config
00:08:31.226 17:02:26 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*'
00:08:31.226 17:02:26 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]]
00:08:31.226 17:02:26 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:08:37.796 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci
00:08:37.796 17:02:32 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify
00:08:37.796 17:02:32 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver
00:08:37.796 17:02:32 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset
00:08:37.796 17:02:32 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]]
00:08:37.796 17:02:32 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:08:41.991
00:08:41.991 real 0m10.688s
00:08:41.991 user 0m2.753s
00:08:41.991 sys 0m5.470s
00:08:41.991 17:02:36 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:41.991 17:02:36 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x
00:08:41.991 ************************************
00:08:41.991 END TEST allowed
00:08:41.991 ************************************
00:08:41.991 17:02:36 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0
00:08:41.991
00:08:41.991 real 0m28.118s
00:08:41.991 user 0m8.582s
00:08:41.991 sys 0m16.686s
00:08:41.991 17:02:36 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:41.991 17:02:36 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:08:41.991 ************************************
00:08:41.991 END TEST acl
00:08:41.991 ************************************
00:08:41.991 17:02:36 setup.sh -- common/autotest_common.sh@1142 -- # return 0
00:08:41.991 17:02:36 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh
00:08:41.991 17:02:36 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:41.991 17:02:36 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:41.991 17:02:36 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:08:41.991 ************************************
00:08:41.991 START TEST hugepages
00:08:41.991 ************************************
00:08:41.991 17:02:37 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh
00:08:41.991 * Looking for test storage...
00:08:41.991 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=()
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@18 -- # local node=
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@19 -- # local var val
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem
00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 69747708 kB' 'MemAvailable: 73508196 kB' 'Buffers: 12196 kB' 'Cached: 16104624 kB' 'SwapCached: 0 kB' 'Active: 12976796 kB' 'Inactive: 3649220 kB' 'Active(anon): 12535260 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512544 kB' 'Mapped: 174784 kB' 'Shmem: 12026064 kB' 'KReclaimable: 505944 kB' 'Slab: 887904 kB' 'SReclaimable: 505944 kB' 'SUnreclaim: 381960 kB' 'KernelStack: 16128 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438192 kB' 'Committed_AS: 13999184 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201416 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:41.991 
17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.991 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # 
IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 
17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 
17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': 
' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 
setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 
17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.992 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- 
setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 
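The long trace above is `setup/common.sh`'s `get_meminfo` scanning every `/proc/meminfo` field until it matches `Hugepagesize` and echoes its value (2048 kB here). A minimal sketch of that loop, assuming plain `/proc/meminfo` (the per-node `/sys/devices/system/node/node*/meminfo` branch and its `Node N` prefix stripping are omitted; `get_meminfo_sketch` is a hypothetical name, not the script's own):

```shell
#!/usr/bin/env bash
# Sketch of the traced get_meminfo loop: split each /proc/meminfo line
# on ': ' into field name and value, and print the value for the
# requested field (in kB for sized fields).
get_meminfo_sketch() {
  local field=$1
  local var val _
  while IFS=': ' read -r var val _; do
    if [[ $var == "$field" ]]; then
      echo "$val"   # e.g. 2048 for Hugepagesize on this host
      return 0
    fi
  done < /proc/meminfo
  return 1          # field not present
}
```

Usage mirrors the trace: `default_hugepages=$(get_meminfo_sketch Hugepagesize)`.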
00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:08:41.993 17:02:37 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:08:41.993 17:02:37 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:41.993 17:02:37 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.993 17:02:37 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:08:41.993 
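The `get_nodes`/`clear_hp` portion of the trace above enumerates NUMA nodes and writes `0` to each node's `nr_hugepages` before the test allocates its own pages. A dry-run sketch of that reset, assuming the standard sysfs layout (`clear_hp_sketch` is a hypothetical name; it prints the writes instead of performing them, since the real ones need root):

```shell
#!/usr/bin/env bash
# Sketch of the traced clear_hp loop: for every NUMA node and every
# hugepage size directory under it, show the write that would zero
# the reserved page count. Replace echo with the actual redirection
# (as root) to perform the reset.
clear_hp_sketch() {
  shopt -s nullglob   # skip loops cleanly if a glob matches nothing
  local node hp
  for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
      echo "echo 0 > $hp/nr_hugepages"
    done
  done
}
```

The trace then exports `CLEAR_HUGE=yes` so later `setup.sh` invocations know the counts start from zero.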
************************************ 00:08:41.993 START TEST default_setup 00:08:41.993 ************************************ 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes 
in "${user_nodes[@]}" 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:08:41.993 17:02:37 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:08:45.284 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:08:45.284 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:08:45.284 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:45.284 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:45.284 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:45.543 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:48.080 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:08:48.080 17:02:43 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71900844 kB' 'MemAvailable: 75661392 
kB' 'Buffers: 12196 kB' 'Cached: 16104748 kB' 'SwapCached: 0 kB' 'Active: 12997636 kB' 'Inactive: 3649220 kB' 'Active(anon): 12556100 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533140 kB' 'Mapped: 174948 kB' 'Shmem: 12026188 kB' 'KReclaimable: 506004 kB' 'Slab: 886400 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 380396 kB' 'KernelStack: 16336 kB' 'PageTables: 9148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14022272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201464 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.080 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # 
return 0 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71899916 kB' 'MemAvailable: 75660464 kB' 'Buffers: 12196 kB' 'Cached: 16104752 kB' 'SwapCached: 0 kB' 'Active: 12997268 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555732 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532796 kB' 'Mapped: 174932 kB' 'Shmem: 12026192 kB' 'KReclaimable: 506004 kB' 'Slab: 886452 kB' 'SReclaimable: 506004 
kB' 'SUnreclaim: 380448 kB' 'KernelStack: 16272 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14022288 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201384 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.346 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.348 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71899180 kB' 'MemAvailable: 75659728 kB' 'Buffers: 12196 kB' 'Cached: 16104768 kB' 'SwapCached: 0 kB' 'Active: 12997152 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555616 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
'Writeback: 0 kB' 'AnonPages: 532648 kB' 'Mapped: 174932 kB' 'Shmem: 12026208 kB' 'KReclaimable: 506004 kB' 'Slab: 886452 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 380448 kB' 'KernelStack: 16256 kB' 'PageTables: 8856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14021060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201368 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.348 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 
17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- 
# resv=0 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:08:48.349 nr_hugepages=1024 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:08:48.349 resv_hugepages=0 00:08:48.349 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:08:48.349 surplus_hugepages=0 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:08:48.350 anon_hugepages=0 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71898612 kB' 'MemAvailable: 75659160 kB' 'Buffers: 12196 kB' 'Cached: 16104792 kB' 'SwapCached: 0 kB' 'Active: 12997068 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555532 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532516 kB' 'Mapped: 174932 kB' 'Shmem: 12026232 kB' 'KReclaimable: 506004 kB' 'Slab: 886452 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 380448 kB' 'KernelStack: 16160 kB' 'PageTables: 8656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14022332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201432 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.350 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- 
# continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.351 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:08:48.352 
17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:08:48.352 
17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 30324096 kB' 'MemUsed: 17792844 kB' 'SwapCached: 0 kB' 'Active: 11287604 kB' 'Inactive: 3412680 kB' 'Active(anon): 11065300 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439500 kB' 'Mapped: 123060 kB' 'AnonPages: 263900 kB' 'Shmem: 10804516 kB' 'KernelStack: 9288 kB' 'PageTables: 4552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328668 kB' 'Slab: 556540 kB' 'SReclaimable: 328668 kB' 'SUnreclaim: 227872 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 
17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.352 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:08:48.353 17:02:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:08:48.353 node0=1024 expecting 1024 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:08:48.353 00:08:48.353 real 0m6.463s 00:08:48.353 user 0m1.458s 00:08:48.353 sys 0m2.585s 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:48.353 17:02:43 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:08:48.353 ************************************ 00:08:48.353 END TEST default_setup 00:08:48.353 ************************************ 00:08:48.353 17:02:43 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:08:48.353 17:02:43 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:08:48.353 17:02:43 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:48.353 17:02:43 setup.sh.hugepages -- 
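The default_setup trace above shows `setup/common.sh`'s `get_meminfo` helper scanning `/proc/meminfo` field by field with `IFS=': '` until it reaches the requested key (here `HugePages_Surp`), then echoing its value and returning. A minimal standalone sketch of that parse loop, under the assumption that the helper's contract is simply "print the numeric value for one key" (this is an illustration, not the SPDK script itself):

```shell
#!/usr/bin/env bash
# Hedged sketch of the get_meminfo pattern seen in the trace above:
# split each /proc/meminfo line on ':' and spaces, compare the key,
# and echo the value column when it matches.
get_meminfo() {
    local get=$1
    local var val _
    while IFS=': ' read -r var val _; do
        # e.g. "HugePages_Total:    1024" -> var=HugePages_Total val=1024
        if [[ $var == "$get" ]]; then
            echo "$val"   # value only; the "kB" unit lands in $_
            return 0
        fi
    done < /proc/meminfo
    return 1
}

get_meminfo HugePages_Total
```

The real helper additionally supports per-node lookups via `/sys/devices/system/node/node<N>/meminfo`, as the `[[ -e ... ]]` checks in the trace suggest; this sketch covers only the system-wide case.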
common/autotest_common.sh@1105 -- # xtrace_disable 00:08:48.353 17:02:43 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:08:48.353 ************************************ 00:08:48.353 START TEST per_node_1G_alloc 00:08:48.353 ************************************ 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:08:48.613 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@67 -- # nodes_test=() 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:08:48.614 17:02:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:08:52.816 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:08:52.816 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:08:52.816 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:08:52.816 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:00:04.3 (8086 2021): Already using the 
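The per_node_1G_alloc prologue traced above converts the requested size into a per-node page count: `get_test_nr_hugepages 1048576 0 1` divides a 1 GiB request (in kB) by the 2048 kB default huge page size to get `nr_hugepages=512`, then assigns 512 to each node in the list. A hedged sketch of that arithmetic, with variable names following the trace and the constants taken from the logged values:

```shell
#!/usr/bin/env bash
# Sketch of the get_test_nr_hugepages arithmetic from the trace:
# requested size (kB) / default huge page size (kB) = pages,
# replicated across every requested NUMA node.
size=1048576            # 1 GiB request, expressed in kB as in the trace
default_hugepages=2048  # 2 MiB default huge page size, in kB
node_ids=(0 1)          # HUGENODE=0,1 in the trace

nr_hugepages=$(( size / default_hugepages ))   # 1048576 / 2048 = 512

declare -A nodes_test
for node in "${node_ids[@]}"; do
    nodes_test[$node]=$nr_hugepages
done

echo "NRHUGE=$nr_hugepages HUGENODE=${node_ids[*]}"
```

This matches the `NRHUGE=512 HUGENODE=0,1` environment the trace exports before invoking `scripts/setup.sh`; the associative-array bookkeeping here is illustrative of the intent, not a copy of the script.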
vfio-pci driver 00:08:52.816 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:08:52.816 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local 
get=AnonHugePages 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71897664 kB' 'MemAvailable: 75658212 kB' 'Buffers: 12196 kB' 'Cached: 16105044 kB' 'SwapCached: 0 kB' 'Active: 12995664 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554128 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530900 kB' 'Mapped: 173916 kB' 'Shmem: 12026484 kB' 'KReclaimable: 506004 kB' 'Slab: 885992 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 379988 kB' 'KernelStack: 16208 kB' 'PageTables: 8952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14010812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201672 
kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.816 17:02:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.816 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.817 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@97 -- # anon=0
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71897368 kB' 'MemAvailable: 75657916 kB' 'Buffers: 12196 kB' 'Cached: 16105048 kB' 'SwapCached: 0 kB' 'Active: 12995488 kB' 'Inactive: 3649220 kB' 'Active(anon): 12553952 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530728 kB' 'Mapped: 173876 kB' 'Shmem: 12026488 kB' 'KReclaimable: 506004 kB' 'Slab: 885964 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 379960 kB' 'KernelStack: 16272 kB' 'PageTables: 8756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14010828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201576 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:08:52.818 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical setup/common.sh@31/@32 read/compare/continue trace repeated for every remaining /proc/meminfo field until HugePages_Surp matches ...]
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:52.820 17:02:47
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71895920 kB' 'MemAvailable: 75656468 kB' 'Buffers: 12196 kB' 'Cached: 16105052 kB' 'SwapCached: 0 kB' 'Active: 12994936 kB' 'Inactive: 3649220 kB' 'Active(anon): 12553400 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530196 kB' 'Mapped: 173884 kB' 'Shmem: 12026492 kB' 'KReclaimable: 506004 kB' 'Slab: 885956 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 379952 kB' 'KernelStack: 16240 kB' 'PageTables: 8548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14010852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201592 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:08:52.820 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical setup/common.sh@31/@32 read/compare/continue trace repeated for the remaining /proc/meminfo fields against HugePages_Rsvd ...]
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.821 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:08:52.822 nr_hugepages=1024 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 
00:08:52.822 resv_hugepages=0 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:08:52.822 surplus_hugepages=0 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:08:52.822 anon_hugepages=0 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71895804 
kB' 'MemAvailable: 75656352 kB' 'Buffers: 12196 kB' 'Cached: 16105088 kB' 'SwapCached: 0 kB' 'Active: 12995752 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554216 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531024 kB' 'Mapped: 173876 kB' 'Shmem: 12026528 kB' 'KReclaimable: 506004 kB' 'Slab: 885956 kB' 'SReclaimable: 506004 kB' 'SUnreclaim: 379952 kB' 'KernelStack: 16352 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14010872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201608 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 
17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.822 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:08:52.823 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # read loop: remaining /proc/meminfo fields (HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted) do not match HugePages_Total; each iteration takes `continue` (identical per-field trace lines elided)
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29-30 -- # for node in /sys/devices/system/node/node+([0-9]): nodes_sys[0]=512, nodes_sys[1]=512
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 31365540 kB' 'MemUsed: 16751400 kB' 'SwapCached: 0 kB' 'Active: 11288052 kB' 'Inactive: 3412680 kB' 'Active(anon): 11065748 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439644 kB' 'Mapped: 122424 kB' 'AnonPages: 264304 kB' 'Shmem: 10804660 kB' 'KernelStack: 9352 kB' 'PageTables: 4676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328668 kB' 'Slab: 556256 kB' 'SReclaimable: 328668 kB' 'SUnreclaim: 227588 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:08:52.824 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # read loop: node-0 fields MemTotal through HugePages_Free do not match HugePages_Surp; each iteration takes `continue` (identical per-field trace lines elided)
00:08:52.825 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 40531208 kB' 'MemUsed: 3645336 kB' 'SwapCached: 0 kB' 'Active: 1707284 kB' 'Inactive: 236540 kB' 'Active(anon): 1488052 kB' 'Inactive(anon): 0 kB' 'Active(file): 219232 kB' 'Inactive(file): 236540 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1677664 kB' 'Mapped: 51452 kB' 'AnonPages: 266192 kB' 'Shmem: 1221892 kB' 'KernelStack: 6824 kB' 'PageTables: 3712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 177336 kB' 'Slab: 329700 kB' 'SReclaimable: 177336 kB' 'SUnreclaim: 152364 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:08:52.826 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31-32 -- # read loop: node-1 fields MemTotal through FileHugePages do not match HugePages_Surp; each iteration takes `continue` (identical per-field trace lines elided) 00:08:52.827 17:02:47
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:08:52.827 node0=512 expecting 512 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:08:52.827 node1=512 expecting 512 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:08:52.827 00:08:52.827 real 0m4.020s 00:08:52.827 user 0m1.554s 00:08:52.827 sys 0m2.571s 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:52.827 17:02:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:08:52.827 ************************************ 00:08:52.827 END TEST per_node_1G_alloc 00:08:52.827 ************************************ 00:08:52.827 17:02:47 setup.sh.hugepages -- 
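The xtrace above comes from setup/common.sh's get_meminfo helper, which scans /proc/meminfo line by line with `IFS=': ' read -r var val _`, skipping every key (via `continue`) until the requested one matches, then echoing its value. A minimal stand-alone sketch of that pattern follows; the function name `get_field` and the sample input are illustrative, not taken from SPDK, and it reads from stdin so it can be tested against a snippet instead of the real /proc/meminfo:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo scanning pattern seen in the trace: split each
# "Key:   value" line on ':' and spaces, skip non-matching keys, and print
# the value of the requested key.
get_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # mirrors the repeated "[[ $var == $get ]] / continue" pairs in the xtrace
        [ "$var" = "$get" ] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Example: extract HugePages_Surp from a two-line sample snippet
printf 'MemTotal: 92293484 kB\nHugePages_Surp: 0\n' | get_field HugePages_Surp
```

Because `IFS=': '` splits on both the colon and the following spaces, `var` receives the key, `val` the first value field, and `_` swallows the trailing unit (e.g. `kB`), which is why the trace echoes bare numbers like `0`.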
common/autotest_common.sh@1142 -- # return 0
00:08:52.827 17:02:47 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:08:52.827 17:02:47 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:52.827 17:02:47 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:52.827 17:02:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:08:52.827 ************************************
00:08:52.827 START TEST even_2G_alloc
00:08:52.827 ************************************
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:08:52.827 17:02:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:08:56.121 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:08:56.121 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:08:56.121 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:08:56.121 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:08:56.121-00:08:56.387 0000:00:04.0-0000:00:04.6, 0000:80:04.0-0000:80:04.7 (8086 2021): Already using the vfio-pci driver [condensed: 15 identical per-function lines]
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
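The get_test_nr_hugepages_per_node trace above (hugepages.sh@81-84) walks `_no_nodes` down from 2 and assigns 512 pages to each NUMA node, which is where the later `node0=512 expecting 512` / `node1=512 expecting 512` checks come from. A simplified bash sketch of that even split follows; it is not a copy of setup/hugepages.sh, and the division is an assumption about what the hard-coded 512s in the trace represent:

```shell
#!/usr/bin/env bash
# Simplified sketch of the even per-node split implied by the trace:
# a 2 GiB request at the default 2 MiB hugepage size is 1024 pages,
# divided evenly across 2 NUMA nodes. Variable names mirror the trace.
size=2097152            # requested size in KiB (2 GiB), as in get_test_nr_hugepages 2097152
default_hugepages=2048  # hugepage size in KiB (2 MiB), matching 'Hugepagesize: 2048 kB'
no_nodes=2

nr_hugepages=$(( size / default_hugepages ))   # 1024 pages total
declare -a nodes_test
node=$no_nodes
while (( node > 0 )); do
    # assign this node its even share, counting down like the trace's
    # nodes_test[_no_nodes - 1]=512 loop
    nodes_test[node - 1]=$(( nr_hugepages / no_nodes ))
    node=$(( node - 1 ))
done

echo "node0=${nodes_test[0]} expecting 512"
echo "node1=${nodes_test[1]} expecting 512"
```

With `HUGE_EVEN_ALLOC=yes` and `NRHUGE=1024`, setup.sh is then expected to realize exactly this layout, which verify_nr_hugepages confirms against the kernel's reported counters.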
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:08:56.387 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71869216 kB' 'MemAvailable: 75629748 kB' 'Buffers: 12196 kB' 'Cached: 16105476 kB' 'SwapCached: 0 kB' 'Active: 12995496 kB' 'Inactive: 3649220 kB' 'Active(anon): 12553960 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530320 kB' 'Mapped: 174008 kB' 'Shmem: 12026916 kB' 'KReclaimable: 505988 kB' 'Slab: 886524 kB' 'SReclaimable: 505988 kB' 'SUnreclaim: 380536 kB' 'KernelStack: 16416 kB' 'PageTables: 8936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14011924 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201528 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
00:08:56.387-00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: the get_meminfo read loop skips every /proc/meminfo key that is not AnonHugePages via "continue": MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted]
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:56.388 17:02:51 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71868904 kB' 'MemAvailable: 75629404 kB' 'Buffers: 12196 kB' 'Cached: 16105480 kB' 'SwapCached: 0 kB' 'Active: 12995184 kB' 'Inactive: 3649220 kB' 'Active(anon): 12553648 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529920 kB' 'Mapped: 173884 kB' 'Shmem: 12026920 kB' 'KReclaimable: 505956 kB' 'Slab: 886496 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380540 kB' 'KernelStack: 16208 kB' 'PageTables: 8592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14010824 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201480 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 
19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.389 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # 
local var val 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:08:56.390 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71867704 kB' 'MemAvailable: 75628204 kB' 'Buffers: 12196 kB' 'Cached: 16105500 kB' 'SwapCached: 0 kB' 'Active: 12995676 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554140 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530388 kB' 'Mapped: 173884 kB' 'Shmem: 12026940 kB' 'KReclaimable: 505956 kB' 'Slab: 886496 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380540 kB' 'KernelStack: 16208 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14011980 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201544 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 
17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
00:08:56.391 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [meminfo scan: every field from Inactive(anon) through HugePages_Free compared against HugePages_Rsvd; no match, continue]
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:08:56.393 nr_hugepages=1024
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:08:56.393 resv_hugepages=0
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:08:56.393 surplus_hugepages=0
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:08:56.393 anon_hugepages=0
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 ==
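The trace above is setup/common.sh's get_meminfo helper splitting each meminfo line on `': '` and walking field by field until the requested key matches. A minimal standalone sketch of that pattern, not the SPDK helper itself: it reads from stdin for illustration rather than using `mapfile` over /proc/meminfo, and the here-doc values are sample data.

```shell
# Sketch of the get_meminfo scan pattern seen in the trace: split each
# "Field: value kB" line on ': ', stop at the requested field, echo its value.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # e.g. "HugePages_Rsvd:   0" -> var=HugePages_Rsvd val=0
        if [ "$var" = "$get" ]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

# Sample input standing in for /proc/meminfo
get_meminfo HugePages_Rsvd <<'EOF'
HugePages_Total:    1024
HugePages_Free:     1024
HugePages_Rsvd:        0
EOF
```

On a Linux box the same function can be fed `< /proc/meminfo` directly; the field-by-field `continue` lines in the trace are simply this loop skipping non-matching keys.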
nr_hugepages + surp + resv ))
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:08:56.393 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71867224 kB' 'MemAvailable: 75627724 kB' 'Buffers: 12196 kB' 'Cached: 16105536 kB' 'SwapCached: 0 kB' 'Active: 12995756 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554220 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530464 kB' 'Mapped: 173884 kB' 'Shmem: 12026976 kB' 'KReclaimable: 505956 kB' 'Slab: 886564 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380608 kB' 'KernelStack: 16320 kB' 'PageTables: 8688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14012000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201560 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
00:08:56.393-00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [meminfo scan: every field from MemTotal through Unaccepted compared against HugePages_Total; no match, continue]
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # 
no_nodes=2 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.745 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 31344756 kB' 'MemUsed: 16772184 kB' 'SwapCached: 0 kB' 'Active: 11287352 kB' 'Inactive: 3412680 kB' 'Active(anon): 11065048 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 
'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439780 kB' 'Mapped: 122432 kB' 'AnonPages: 263508 kB' 'Shmem: 10804796 kB' 'KernelStack: 9336 kB' 'PageTables: 4576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328660 kB' 'Slab: 556528 kB' 'SReclaimable: 328660 kB' 'SUnreclaim: 227868 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.746 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 
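The trace above repeatedly exercises the lookup pattern in `setup/common.sh`: read `Key: Value` pairs with `IFS=': '`, `continue` past every key that does not match the requested one, then `echo` the value and `return 0` on a match. The following is a minimal re-creation of that pattern, not the exact SPDK helper; the function name and the demo file contents are illustrative assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the meminfo key-lookup loop seen in the trace (assumed
# simplification, not the verbatim setup/common.sh helper): scan
# "Key: Value" lines, skip non-matching keys, print the matching value.
get_meminfo() {
  local get=$1 file=$2 var val _
  while IFS=': ' read -r var val _; do
    # Non-matching keys take the "continue" branch, as in the trace.
    [[ $var == "$get" ]] || continue
    echo "$val"
    return 0
  done < "$file"
  return 1
}

# Demo against a fabricated meminfo snippet (values are illustrative only).
tmp=$(mktemp)
printf '%s\n' 'MemTotal: 48116940 kB' 'HugePages_Total: 512' \
  'HugePages_Surp: 0' > "$tmp"
get_meminfo HugePages_Surp "$tmp"
rm -f "$tmp"
```

This mirrors why the log is so long: every `/proc/meminfo` key before the target produces one `[[ … ]]` test plus one `continue` in the xtrace.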
00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 40525740 kB' 'MemUsed: 3650804 kB' 'SwapCached: 0 kB' 'Active: 1708588 kB' 'Inactive: 236540 kB' 'Active(anon): 1489356 kB' 'Inactive(anon): 0 kB' 'Active(file): 219232 kB' 'Inactive(file): 236540 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1677980 kB' 'Mapped: 51452 kB' 'AnonPages: 267200 kB' 'Shmem: 1222208 kB' 'KernelStack: 7000 kB' 'PageTables: 4244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 
'KReclaimable: 177296 kB' 'Slab: 330040 kB' 'SReclaimable: 177296 kB' 'SUnreclaim: 152744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.747 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 
17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:08:56.748 17:02:51 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:08:56.748 node0=512 expecting 512 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:08:56.748 node1=512 expecting 512 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:08:56.748 00:08:56.748 real 0m4.005s 00:08:56.748 user 0m1.519s 00:08:56.748 sys 0m2.586s 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:56.748 17:02:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:08:56.748 ************************************ 00:08:56.748 END TEST even_2G_alloc 00:08:56.748 ************************************ 00:08:56.748 17:02:51 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:08:56.748 17:02:51 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:08:56.748 17:02:51 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:56.748 17:02:51 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.748 17:02:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:08:56.748 ************************************ 00:08:56.748 START TEST odd_alloc 00:08:56.748 ************************************ 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 
00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:08:56.748 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:08:56.749 17:02:51 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:08:56.749 17:02:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:09:00.040 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:09:00.040 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:09:00.040 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:09:00.040 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:09:00.040 
0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:09:00.040 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:09:00.305 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71861284 kB' 'MemAvailable: 75621784 kB' 'Buffers: 12196 kB' 'Cached: 16105812 kB' 'SwapCached: 0 kB' 'Active: 12996804 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555268 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530696 kB' 'Mapped: 174004 kB' 'Shmem: 12027252 kB' 'KReclaimable: 505956 kB' 'Slab: 886572 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380616 kB' 'KernelStack: 16176 kB' 'PageTables: 8440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 14009900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201576 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.306 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:00.307 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71860828 kB' 'MemAvailable: 75621328 kB' 'Buffers: 12196 kB' 'Cached: 16105816 kB' 'SwapCached: 0 kB' 'Active: 12996036 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554500 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530412 kB' 'Mapped: 173896 kB' 'Shmem: 12027256 kB' 'KReclaimable: 505956 kB' 'Slab: 886500 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380544 kB' 'KernelStack: 16176 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 14009916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201544 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.307 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.308 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.308 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 
17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71861332 kB' 'MemAvailable: 75621832 kB' 'Buffers: 12196 kB' 'Cached: 16105816 kB' 'SwapCached: 0 kB' 'Active: 12996072 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554536 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530444 kB' 'Mapped: 173896 kB' 'Shmem: 12027256 kB' 'KReclaimable: 505956 kB' 'Slab: 886496 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380540 kB' 'KernelStack: 16192 kB' 'PageTables: 8476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 14009936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201544 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.309 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.309 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.310 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 
00:09:00.311 nr_hugepages=1025 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:09:00.311 resv_hugepages=0 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:09:00.311 surplus_hugepages=0 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:09:00.311 anon_hugepages=0 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.311 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71861168 kB' 
'MemAvailable: 75621668 kB' 'Buffers: 12196 kB' 'Cached: 16105852 kB' 'SwapCached: 0 kB' 'Active: 12996648 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555112 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531052 kB' 'Mapped: 173896 kB' 'Shmem: 12027292 kB' 'KReclaimable: 505956 kB' 'Slab: 886496 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380540 kB' 'KernelStack: 16176 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 14011212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201512 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 
17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.312 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 
17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # 
get_meminfo HugePages_Surp 0 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 31338620 kB' 'MemUsed: 16778320 kB' 'SwapCached: 0 kB' 'Active: 11287756 kB' 'Inactive: 3412680 kB' 'Active(anon): 11065452 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439860 kB' 'Mapped: 122444 kB' 'AnonPages: 263704 kB' 'Shmem: 10804876 kB' 'KernelStack: 9448 kB' 'PageTables: 4536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328660 kB' 'Slab: 556572 kB' 'SReclaimable: 328660 kB' 'SUnreclaim: 227912 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.313 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.314 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:00.315 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 40522096 kB' 'MemUsed: 3654448 kB' 'SwapCached: 0 kB' 'Active: 1708564 kB' 'Inactive: 236540 kB' 'Active(anon): 1489332 kB' 'Inactive(anon): 0 kB' 'Active(file): 219232 kB' 'Inactive(file): 236540 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1678232 kB' 'Mapped: 51452 kB' 'AnonPages: 266976 kB' 'Shmem: 1222460 kB' 'KernelStack: 6840 kB' 'PageTables: 3856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 177296 kB' 'Slab: 329924 kB' 'SReclaimable: 177296 kB' 'SUnreclaim: 152628 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.315 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.315 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 
00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:09:00.316 node0=512 expecting 513 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:09:00.316 node1=513 expecting 512 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:09:00.316 00:09:00.316 real 0m3.739s 00:09:00.316 user 0m1.329s 00:09:00.316 sys 0m2.488s 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:00.316 17:02:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:09:00.316 ************************************ 00:09:00.316 END TEST odd_alloc 00:09:00.316 ************************************ 00:09:00.576 17:02:55 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:09:00.576 17:02:55 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:09:00.576 17:02:55 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:00.576 17:02:55 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:00.576 17:02:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:09:00.576 
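The odd_alloc trace above repeatedly runs `IFS=': '` / `read -r var val _` over a meminfo stream, hitting `continue` for every non-matching key until the requested one (`HugePages_Surp`) is found, then echoes its value. A minimal standalone sketch of that lookup pattern (hypothetical helper name; the real `get_meminfo` in setup/common.sh also selects per-node meminfo files under /sys/devices/system/node):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the meminfo lookup pattern visible in the trace:
# split each "Key: value [kB]" line on ': ', skip non-matching keys
# (the long runs of "continue" above), and print the requested key's value.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Matching key found: emit its numeric value (a trailing "kB" lands in $_)
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    echo 0   # key absent: report 0, mirroring the trace's "echo 0 / return 0"
}
```

For a file containing `HugePages_Surp: 0`, `get_meminfo_sketch HugePages_Surp <file>` prints `0`, which is the value the trace feeds into `(( nodes_test[node] += 0 ))`.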
************************************ 00:09:00.576 START TEST custom_alloc 00:09:00.576 ************************************ 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 
00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:09:00.576 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@62 -- # user_nodes=() 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # 
get_test_nr_hugepages_per_node 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:09:00.577 17:02:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:09:04.774 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:09:04.774 0000:85:05.5 (8086 201d): 
Skipping not allowed VMD controller at 0000:85:05.5 00:09:04.774 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:09:04.774 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:09:04.774 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:09:04.774 
17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.774 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 70805232 kB' 'MemAvailable: 74565732 kB' 'Buffers: 12196 kB' 'Cached: 16105968 kB' 'SwapCached: 0 kB' 'Active: 12996336 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554800 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530716 kB' 'Mapped: 
173940 kB' 'Shmem: 12027408 kB' 'KReclaimable: 505956 kB' 'Slab: 886104 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380148 kB' 'KernelStack: 16208 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 14010568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201448 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.775 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.776 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 70813896 kB' 'MemAvailable: 74574396 kB' 'Buffers: 12196 kB' 'Cached: 16105968 kB' 'SwapCached: 0 kB' 'Active: 12996528 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554992 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530892 kB' 'Mapped: 173912 kB' 'Shmem: 12027408 kB' 'KReclaimable: 505956 kB' 'Slab: 886144 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380188 kB' 'KernelStack: 16160 kB' 'PageTables: 8376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 14011704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201400 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 
0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:04.776-00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [repetitive trace condensed: for every /proc/meminfo key from MemTotal through HugePages_Rsvd, the loop runs IFS=': '; read -r var val _; the [[ $var == HugePages_Surp ]] test fails and the branch hits continue] 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:04.778 17:02:59
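[editor's note: the trace above is setup/common.sh's get_meminfo helper scanning /proc/meminfo for one key. A minimal self-contained sketch of that loop (assumed shape, not the exact SPDK source) looks like this:]

```shell
#!/usr/bin/env bash
# Sketch of the traced loop: split each /proc/meminfo line on ':' and
# spaces, skip (continue) every key that does not match, and print the
# value of the requested key, e.g. HugePages_Surp or HugePages_Rsvd.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue  # non-matching key: keep scanning
        echo "$val"                       # numeric value; unit ("kB") lands in $_
        return 0
    done < /proc/meminfo
    return 1                              # key not present in /proc/meminfo
}

get_meminfo HugePages_Total
```

[the hundreds of `continue` lines in the trace are simply this loop stepping past every non-matching meminfo key at xtrace verbosity]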
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.778 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 70814376 kB' 'MemAvailable: 74574876 kB' 'Buffers: 12196 kB' 'Cached: 16105988 kB' 'SwapCached: 0 kB' 'Active: 12996532 kB' 'Inactive: 3649220 kB' 'Active(anon): 12554996 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530776 kB' 'Mapped: 173912 kB' 'Shmem: 12027428 kB' 'KReclaimable: 505956 kB' 'Slab: 886128 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380172 kB' 'KernelStack: 16112 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 14012964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201416 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 
'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:04.778-00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [repetitive trace condensed: the same per-key scan repeats for HugePages_Rsvd; each key from MemTotal through CommitLimit fails the [[ $var == HugePages_Rsvd ]] test and hits continue; the log is truncated here] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:09:04.780 nr_hugepages=1536 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:09:04.780 resv_hugepages=0 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:09:04.780 surplus_hugepages=0 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:09:04.780 anon_hugepages=0 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@17 -- # local get=HugePages_Total
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:09:04.780 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 70812868 kB' 'MemAvailable: 74573368 kB' 'Buffers: 12196 kB' 'Cached: 16106008 kB' 'SwapCached: 0 kB' 'Active: 12997184 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555648 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531412 kB' 'Mapped: 173912 kB' 'Shmem: 12027448 kB' 'KReclaimable: 505956 kB' 'Slab: 886128 kB' 'SReclaimable: 505956 kB' 'SUnreclaim: 380172 kB' 'KernelStack: 16336 kB' 'PageTables: 8864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 14013236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201512 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
[... identical setup/common.sh@31/@32 xtrace (IFS=': ' / read -r var val _ / [[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue) repeats for each /proc/meminfo key, MemTotal through Unaccepted; none matches HugePages_Total ...]
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc --
setup/common.sh@17 -- # local get=HugePages_Surp 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 31348140 kB' 'MemUsed: 16768800 kB' 'SwapCached: 0 kB' 'Active: 11289188 kB' 'Inactive: 3412680 kB' 'Active(anon): 11066884 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439892 kB' 'Mapped: 122460 kB' 'AnonPages: 265148 kB' 'Shmem: 10804908 kB' 'KernelStack: 9432 kB' 'PageTables: 4732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328660 kB' 'Slab: 556376 kB' 'SReclaimable: 328660 kB' 'SUnreclaim: 227716 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 
00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.782 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 
17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.783 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:04.784 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 39464188 kB' 'MemUsed: 4712356 kB' 'SwapCached: 0 kB' 'Active: 1708216 kB' 'Inactive: 236540 kB' 'Active(anon): 1488984 kB' 'Inactive(anon): 0 kB' 'Active(file): 219232 kB' 'Inactive(file): 236540 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1678332 kB' 'Mapped: 51452 kB' 'AnonPages: 265952 kB' 'Shmem: 1222560 kB' 'KernelStack: 6808 kB' 'PageTables: 3756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 177296 kB' 'Slab: 329752 kB' 'SReclaimable: 177296 kB' 'SUnreclaim: 152456 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.784 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.785 17:02:59 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': ' / read -r var val _ / continue loop over the remaining /proc/meminfo keys (NFS_Unstable through HugePages_Free); none matches HugePages_Surp] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:09:04.785 node0=512 expecting 512 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:09:04.785 node1=1024 expecting 1024 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:09:04.785 00:09:04.785 real 0m4.012s 00:09:04.785 user
0m1.526s 00:09:04.785 sys 0m2.593s 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.785 17:02:59 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:09:04.785 ************************************ 00:09:04.785 END TEST custom_alloc 00:09:04.785 ************************************ 00:09:04.785 17:02:59 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:09:04.785 17:02:59 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:09:04.785 17:02:59 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:04.785 17:02:59 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.785 17:02:59 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:09:04.785 ************************************ 00:09:04.785 START TEST no_shrink_alloc 00:09:04.785 ************************************ 00:09:04.785 17:02:59 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:09:04.785 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:09:04.785 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:09:04.785 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:09:04.786 17:02:59 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:09:04.786 17:02:59 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:09:08.987 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:09:08.987 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:09:08.987 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:09:08.987 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:09:08.988 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 
00:09:08.988 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:09:08.988 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
get=AnonHugePages 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71747160 kB' 'MemAvailable: 75507652 kB' 'Buffers: 12196 kB' 'Cached: 16106112 kB' 'SwapCached: 0 kB' 'Active: 13003564 kB' 'Inactive: 3649220 kB' 'Active(anon): 12562028 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537236 kB' 'Mapped: 174880 kB' 'Shmem: 12027552 kB' 'KReclaimable: 505948 kB' 'Slab: 886396 kB' 'SReclaimable: 505948 kB' 'SUnreclaim: 380448 kB' 'KernelStack: 16128 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14021460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201580 kB' 'VmallocChunk: 0 
kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:08.988 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': ' / read -r var val _ / continue loop over the /proc/meminfo keys MemTotal through HardwareCorrupted; none matches AnonHugePages] 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:09:08.989 17:03:03
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.989 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71746248 kB' 'MemAvailable: 75506740 kB' 'Buffers: 12196 kB' 'Cached: 16106116 kB' 'SwapCached: 0 kB' 'Active: 13003408 kB' 'Inactive: 3649220 kB' 'Active(anon): 12561872 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537584 kB' 'Mapped: 174796 kB' 'Shmem: 12027556 kB' 'KReclaimable: 505948 kB' 'Slab: 886376 kB' 'SReclaimable: 505948 kB' 'SUnreclaim: 380428 kB' 'KernelStack: 16176 kB' 'PageTables: 8880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14022968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201660 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 
kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': ' / read -r var val _ / continue loop over the /proc/meminfo keys MemTotal through Unevictable; none matches HugePages_Surp] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.990 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 
17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.991 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.992 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71750228 kB' 'MemAvailable: 75510720 kB' 'Buffers: 12196 kB' 'Cached: 16106116 kB' 'SwapCached: 0 kB' 'Active: 13003032 kB' 'Inactive: 3649220 kB' 'Active(anon): 12561496 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 
kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537132 kB' 'Mapped: 174796 kB' 'Shmem: 12027556 kB' 'KReclaimable: 505948 kB' 'Slab: 886368 kB' 'SReclaimable: 505948 kB' 'SUnreclaim: 380420 kB' 'KernelStack: 16208 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14022988 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201660 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.992 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 
17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 
17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.993 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:09:08.994 nr_hugepages=1024 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:09:08.994 resv_hugepages=0 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:09:08.994 surplus_hugepages=0 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:09:08.994 anon_hugepages=0 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 
00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71749268 kB' 'MemAvailable: 75509760 kB' 'Buffers: 12196 kB' 'Cached: 16106156 kB' 'SwapCached: 0 kB' 'Active: 13003168 kB' 'Inactive: 3649220 kB' 'Active(anon): 12561632 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537196 kB' 'Mapped: 174796 kB' 'Shmem: 12027596 kB' 'KReclaimable: 505948 kB' 'Slab: 886368 kB' 'SReclaimable: 505948 kB' 'SUnreclaim: 380420 kB' 'KernelStack: 16224 kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14023012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201692 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.994 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 
17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.995 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo 
HugePages_Surp 0 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 30242576 kB' 'MemUsed: 17874364 kB' 'SwapCached: 0 kB' 'Active: 11287440 kB' 'Inactive: 3412680 kB' 'Active(anon): 11065136 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439896 kB' 'Mapped: 123192 kB' 'AnonPages: 263268 kB' 'Shmem: 10804912 kB' 'KernelStack: 9464 kB' 'PageTables: 4868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328652 kB' 'Slab: 556688 kB' 'SReclaimable: 328652 kB' 'SUnreclaim: 228036 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 
'HugePages_Surp: 0' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.996 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 
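The xtrace lines above all come from one small routine, setup/common.sh's get_meminfo: it splits each meminfo line on `': '` into a key and a value, `continue`s past every key until the requested one matches, then echoes the value. A minimal sketch of that parsing pattern follows; the function name, the file argument, and the sed-based "Node N " prefix strip are illustrative stand-ins, not the exact SPDK code.

```bash
# Sketch of the pattern the trace repeats: read "key: value" pairs from
# /proc/meminfo (or a per-node copy under sysfs) and print one field.
get_meminfo_value() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # The long runs of "continue" in the log are this skip, once per field.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(sed 's/^Node [0-9]* //' "$mem_f")  # per-node files carry a "Node N " prefix
    return 1
}
```

For example, `get_meminfo_value HugePages_Total` prints the global hugepage count the test compares against `nr_hugepages + surp + resv`.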
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:09:08.997 node0=1024 expecting 1024
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:09:08.997 17:03:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:09:12.295 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:09:12.295 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:09:12.295 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:09:12.295 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:09:12.295 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:09:12.295 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e
/sys/devices/system/node/node/meminfo ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71789540 kB' 'MemAvailable: 75550024 kB' 'Buffers: 12196 kB' 'Cached: 16106252 kB' 'SwapCached: 0 kB' 'Active: 12997340 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555804 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531352 kB' 'Mapped: 173980 kB' 'Shmem: 12027692 kB' 'KReclaimable: 505940 kB' 'Slab: 885836 kB' 'SReclaimable: 505940 kB' 'SUnreclaim: 379896 kB' 'KernelStack: 16192 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14011832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201464 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.295 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:09:12.296 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:09:12.296 [xtrace elided: the setup/common.sh@31-32 loop (`IFS=': '; read -r var val _`) tests each remaining /proc/meminfo key against AnonHugePages and hits `continue` for every non-match, Active(file) through HardwareCorrupted]
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:09:12.297 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71791716 kB' 'MemAvailable: 75552200 kB' 'Buffers: 12196 kB' 'Cached: 16106256 kB' 'SwapCached: 0 kB' 'Active: 12996992 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555456 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531036 kB' 'Mapped: 173940 kB' 'Shmem: 12027696 kB' 'KReclaimable: 505940 kB' 'Slab: 885824 kB' 'SReclaimable: 505940 kB' 'SUnreclaim: 379884 kB' 'KernelStack: 16176 kB' 'PageTables: 8432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14011852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201432 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
00:09:12.297 [xtrace elided: the same @31-32 loop `continue`s past every key, MemTotal through HugePages_Rsvd, until HugePages_Surp matches]
00:09:12.298 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:09:12.299 [xtrace elided: same setup/common.sh@18-@31 local-variable setup and mapfile as above]
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71791716 kB' 'MemAvailable: 75552200 kB' 'Buffers: 12196 kB' 'Cached: 16106256 kB' 'SwapCached: 0 kB' 'Active: 12997028 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555492 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531068 kB' 'Mapped: 173940 kB' 'Shmem: 12027696 kB' 'KReclaimable: 505940 kB' 'Slab: 885824 kB' 'SReclaimable: 505940 kB' 'SUnreclaim: 379884 kB' 'KernelStack: 16192 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14011872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201432 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB'
00:09:12.299 [xtrace continues: the @31-32 loop starts testing keys MemTotal, MemFree, MemAvailable, Buffers, ... against HugePages_Rsvd]
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.299 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.300 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.301 17:03:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:09:12.301 17:03:07 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:09:12.301 nr_hugepages=1024 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:09:12.301 resv_hugepages=0 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:09:12.301 surplus_hugepages=0 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:09:12.301 anon_hugepages=0 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 71791392 kB' 'MemAvailable: 75551876 kB' 'Buffers: 12196 kB' 'Cached: 16106312 kB' 'SwapCached: 0 kB' 'Active: 12996672 kB' 'Inactive: 3649220 kB' 'Active(anon): 12555136 kB' 'Inactive(anon): 0 kB' 'Active(file): 441536 kB' 'Inactive(file): 3649220 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530624 kB' 'Mapped: 173940 kB' 'Shmem: 12027752 kB' 'KReclaimable: 505940 kB' 'Slab: 885824 kB' 'SReclaimable: 505940 kB' 'SUnreclaim: 379884 kB' 'KernelStack: 16160 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 14011896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201432 kB' 'VmallocChunk: 0 kB' 'Percpu: 67840 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1144228 kB' 'DirectMap2M: 19503104 kB' 'DirectMap1G: 80740352 kB' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _
00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:09:12.301 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... the same IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue cycle repeats for every remaining /proc/meminfo field (Inactive through Unaccepted); identical iterations elided ...]
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
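The get_nodes trace above walks the NUMA node directories with the extglob pattern `node+([0-9])` and fills `nodes_sys[]` with each node's hugepage count. A minimal bash sketch of that enumeration technique follows; the function and array names are taken from the trace, but the body is a reconstruction rather than SPDK's exact setup/hugepages.sh code, and the sysfs root is a parameter here so the sketch can run against a fake directory tree instead of /sys/devices/system/node:

```shell
#!/usr/bin/env bash
# Sketch of the get_nodes pattern seen in the trace (reconstruction, not the
# real SPDK code). extglob must be on so the node+([0-9]) glob parses.
shopt -s extglob

get_nodes() {
    local root=$1 node
    nodes_sys=()  # node ID -> hugepage count, as in the trace
    for node in "$root"/node+([0-9]); do
        # ${node##*node} strips everything up to the last "node",
        # leaving just the numeric node ID used as the array index
        nodes_sys[${node##*node}]=$(<"$node/nr_hugepages")
    done
}
```

On the machine in the log this loop recorded 1024 pages on node 0 and 0 pages on node 1 (no_nodes=2).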
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 30271644 kB' 'MemUsed: 17845296 kB' 'SwapCached: 0 kB' 'Active: 11286752 kB' 'Inactive: 3412680 kB' 'Active(anon): 11064448 kB' 'Inactive(anon): 0 kB' 'Active(file): 222304 kB' 'Inactive(file): 3412680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14439928 kB' 'Mapped: 122488 kB' 'AnonPages: 262576 kB' 'Shmem: 10804944 kB' 'KernelStack: 9288 kB' 'PageTables: 4444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 328652 kB' 'Slab: 556248 kB' 'SReclaimable: 328652 kB' 'SUnreclaim: 227596 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
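The get_meminfo trace above shows the parsing technique: the meminfo file (the per-node /sys/devices/system/node/node0/meminfo when it exists, otherwise /proc/meminfo) is read line by line with IFS=': ', and each field is skipped with `continue` until the requested key matches, at which point its value is echoed. A minimal bash sketch of that lookup loop (a reconstruction; it takes the file path as an argument and omits the trace's `Node N` prefix stripping):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo lookup seen in the trace (reconstruction).
# Splits each "Key: value [kB]" line on ': ' and echoes the value of the
# requested key, e.g. HugePages_Total -> 1024 on the machine in the log.
get_meminfo_value() {
    local get=$1 mem_f=$2 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue  # the per-field skip loop in the trace
        echo "$val"
        return 0
    done <"$mem_f"
    return 1  # key not present
}
```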
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:09:12.564 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... the same IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue cycle repeats for every remaining node0 meminfo field (MemUsed through HugePages_Free); identical iterations elided ...]
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:09:12.565
00:09:12.565 real 0m7.860s
00:09:12.565 user 0m2.861s 00:09:12.565 sys 0m5.203s 00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.565 17:03:07 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:09:12.565 ************************************ 00:09:12.565 END TEST no_shrink_alloc 00:09:12.565 ************************************ 00:09:12.565 17:03:07 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:09:12.565 17:03:07 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:09:12.565 00:09:12.565 real 0m30.800s 00:09:12.565 user 0m10.518s 00:09:12.565 sys 
0m18.507s 00:09:12.565 17:03:07 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.565 17:03:07 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:09:12.565 ************************************ 00:09:12.565 END TEST hugepages 00:09:12.565 ************************************ 00:09:12.565 17:03:07 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:09:12.565 17:03:07 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:09:12.565 17:03:07 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:12.565 17:03:07 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.565 17:03:07 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:09:12.565 ************************************ 00:09:12.565 START TEST driver 00:09:12.565 ************************************ 00:09:12.565 17:03:07 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:09:12.825 * Looking for test storage... 
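The clear_hp trace above (setup/hugepages.sh@37-41) loops over every node's hugepage pools and writes 0 to each `nr_hugepages`. A runnable sketch of the same loop; a throwaway fake sysfs tree stands in for /sys/devices/system/node so it needs no root:

```shell
# Build a fake sysfs tree so the sketch runs without root privileges.
root=$(mktemp -d)
mkdir -p "$root/node0/hugepages/hugepages-2048kB" \
         "$root/node1/hugepages/hugepages-2048kB"
echo 512 > "$root/node0/hugepages/hugepages-2048kB/nr_hugepages"
echo 256 > "$root/node1/hugepages/hugepages-2048kB/nr_hugepages"

clear_hp() {
    local hp
    for hp in "$1"/node*/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"    # the "echo 0" seen in the trace
    done
}
clear_hp "$root"
cat "$root/node0/hugepages/hugepages-2048kB/nr_hugepages"
```

Against the real tree the glob expands per node and per page size (2048kB, 1048576kB), which is why the trace shows the `echo 0` pair repeated for each node.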
00:09:12.825 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:09:12.825 17:03:08 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:09:12.825 17:03:08 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:09:12.825 17:03:08 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:09:18.104 17:03:13 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:09:18.104 17:03:13 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:18.104 17:03:13 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:18.104 17:03:13 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:09:18.104 ************************************ 00:09:18.104 START TEST guess_driver 00:09:18.104 ************************************ 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 216 > 0 )) 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:09:18.104 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:09:18.104 Looking for driver=vfio-pci 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:09:18.104 17:03:13 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 
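The guess_driver trace above boils down to two checks in setup/driver.sh@36: the host exposes IOMMU groups (`(( 216 > 0 ))`) and `modprobe --show-depends vfio_pci` resolves to real `.ko` files. A distilled sketch with the inputs passed as arguments so the decision is testable off the CI host; the fallback driver name is an assumption, not taken from this trace:

```shell
# Sketch of the vfio-pci pick: require at least one IOMMU group and a
# resolvable vfio_pci module dependency chain, otherwise fall back.
pick_driver() {
    local n_groups=$1 depends=$2
    if (( n_groups > 0 )) && [[ $depends == *.ko* ]]; then
        echo vfio-pci
    else
        echo uio_pci_generic    # assumed fallback when vfio is unusable
    fi
}
pick_driver 216 "insmod /lib/modules/6.7.0/kernel/drivers/vfio/pci/vfio-pci.ko.xz"
```

The `*.ko*` glob mirrors the `== *\.\k\o*` test in the trace (xtrace prints pattern characters backslash-escaped), matching the compressed `.ko.xz` module paths emitted by `modprobe --show-depends`.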
00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:16 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 
00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:22.297 17:03:17 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:24.203 17:03:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:09:24.203 
17:03:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:09:24.203 17:03:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:09:24.494 17:03:19 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:09:24.494 17:03:19 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:09:24.494 17:03:19 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:09:24.494 17:03:19 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:09:29.778 00:09:29.778 real 0m11.490s 00:09:29.778 user 0m2.775s 00:09:29.778 sys 0m5.537s 00:09:29.778 17:03:24 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:29.778 17:03:24 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:09:29.778 ************************************ 00:09:29.778 END TEST guess_driver 00:09:29.778 ************************************ 00:09:29.778 17:03:24 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:09:29.778 00:09:29.778 real 0m16.840s 00:09:29.779 user 0m4.305s 00:09:29.779 sys 0m8.586s 00:09:29.779 17:03:24 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:29.779 17:03:24 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:09:29.779 ************************************ 00:09:29.779 END TEST driver 00:09:29.779 ************************************ 00:09:29.779 17:03:24 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:09:29.779 17:03:24 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:09:29.779 17:03:24 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:29.779 17:03:24 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:29.779 17:03:24 setup.sh -- 
common/autotest_common.sh@10 -- # set +x 00:09:29.779 ************************************ 00:09:29.779 START TEST devices 00:09:29.779 ************************************ 00:09:29.779 17:03:24 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:09:29.779 * Looking for test storage... 00:09:29.779 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:09:29.779 17:03:24 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:09:29.779 17:03:24 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:09:29.779 17:03:24 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:09:29.779 17:03:24 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 
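The get_zoned_devs trace above shows autotest_common.sh's is_block_zoned check: a device is zoned when `/sys/block/<dev>/queue/zoned` exists and reads something other than `none`. A runnable sketch using a fake sysfs tree in place of /sys/block:

```shell
# Fake /sys/block tree: one conventional device, one host-managed zoned one.
sysblk=$(mktemp -d)
mkdir -p "$sysblk/nvme0n1/queue" "$sysblk/nvme1n1/queue"
echo none         > "$sysblk/nvme0n1/queue/zoned"
echo host-managed > "$sysblk/nvme1n1/queue/zoned"

is_block_zoned() {
    local zoned="$sysblk/$1/queue/zoned"
    # zoned iff the attribute exists and is not "none"
    [[ -e $zoned && $(<"$zoned") != none ]]
}
is_block_zoned nvme0n1 && echo zoned || echo conventional
```

The trace's `[[ none != none ]]` is exactly this comparison evaluating false for nvme0n1, which is why the device passes on to the size check that follows.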
00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:09:33.970 17:03:29 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:09:33.970 No valid GPT data, bailing 00:09:33.970 17:03:29 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:09:33.970 17:03:29 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:09:33.970 17:03:29 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@209 -- 
# (( 1 > 0 )) 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:09:33.970 17:03:29 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:33.970 17:03:29 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:09:33.970 ************************************ 00:09:33.970 START TEST nvme_mount 00:09:33.970 ************************************ 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- 
setup/common.sh@46 -- # (( part = 1 )) 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:09:33.970 17:03:29 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:09:34.908 Creating new GPT entries in memory. 00:09:34.908 GPT data structures destroyed! You may now partition the disk using fdisk or 00:09:34.908 other utilities. 00:09:34.908 17:03:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:09:34.908 17:03:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:09:34.908 17:03:30 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:09:34.908 17:03:30 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:09:34.908 17:03:30 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:09:36.288 Creating new GPT entries in memory. 00:09:36.288 The operation has completed successfully. 
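The partition_drive trace above carries setup/common.sh's sector arithmetic: the 1 GiB size is converted to 512-byte sectors, the first partition starts at sector 2048, and the end sector is start + size - 1. A sketch that reproduces the exact `--new=1:2048:2099199` argument seen in the sgdisk call:

```shell
# Partition arithmetic from the trace: 1 GiB in 512-byte sectors,
# first partition anchored at the conventional 2048-sector offset.
size=1073741824; (( size /= 512 ))                 # bytes -> sectors (2097152)
part_start=0; part_end=0
(( part_start = part_start == 0 ? 2048 : part_end + 1 ))
(( part_end = part_start + size - 1 ))
echo "--new=1:$part_start:$part_end"
```

On a second loop iteration the ternary would place the next partition at `part_end + 1`, which is how multi-partition layouts stay contiguous.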
00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 4051685 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:09:36.288 
17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:09:36.288 17:03:31 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.578 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:39.579 17:03:34 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:09:39.838 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:09:39.838 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:09:40.097 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:09:40.097 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:09:40.097 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:09:40.097 
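The cleanup_nvme trace above (setup/devices.sh@20-28) unmounts the scratch mount point if mounted, then runs `wipefs --all` on the partition and on the whole disk, which is what produces the "bytes were erased" lines. A dry-run sketch: stand-in temp files replace real block devices, the mount check is emulated by scanning /proc/mounts, and the destructive commands are echoed rather than executed:

```shell
is_mounted() {    # emulate `mountpoint -q` by scanning /proc/mounts
    awk -v m="$1" '$2 == m { found = 1 } END { exit !found }' /proc/mounts
}

mnt=$(mktemp -d)                  # ordinary dir: not a mountpoint
part=$(mktemp); disk=$(mktemp)    # stand-ins for /dev/nvme0n1p1, /dev/nvme0n1

cleanup_nvme() {
    is_mounted "$mnt" && echo "umount $mnt"
    [[ -e $part ]] && echo "wipefs --all $part"   # real script tests -b
    [[ -e $disk ]] && echo "wipefs --all $disk"
}
cleanup_nvme
```

Wiping the partition before the disk matters: the partition's ext4 magic (`53 ef` at offset 0x438 in the trace) is only reachable while the partition node still exists; wiping the disk's GPT first would remove it.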
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # :
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:09:40.097 17:03:35 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:44.289 17:03:38 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]]
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' ''
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]]
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:09:44.289 17:03:39 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:09:44.289 17:03:39
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.579 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount --
setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:09:47.580 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:09:47.580
00:09:47.580 real 0m13.699s
00:09:47.580 user 0m4.090s
00:09:47.580 sys 0m7.628s
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:47.580 17:03:42 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x
00:09:47.580 ************************************
00:09:47.580 END TEST nvme_mount
00:09:47.580 ************************************
00:09:47.839 17:03:43 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0
00:09:47.840 17:03:43 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:09:47.840 17:03:43 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:09:47.840 17:03:43 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:47.840 17:03:43 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:09:47.840 ************************************
00:09:47.840 START TEST dm_mount
00:09:47.840 ************************************
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=()
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 ))
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:09:47.840 17:03:43 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:09:48.777 Creating new GPT entries in memory.
00:09:48.777 GPT data structures destroyed! You may now partition the disk using fdisk or
00:09:48.777 other utilities.
00:09:48.777 17:03:44 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 ))
00:09:48.777 17:03:44 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:09:48.777 17:03:44 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:09:48.777 17:03:44 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:09:48.777 17:03:44 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:09:49.715 Creating new GPT entries in memory.
00:09:49.715 The operation has completed successfully.
00:09:49.715 17:03:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:09:49.715 17:03:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:09:49.715 17:03:45 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:09:49.715 17:03:45 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:09:49.715 17:03:45 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351
00:09:51.096 The operation has completed successfully.
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 4055994
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5}
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size=
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # :
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:09:51.096 17:03:46 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60
-- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.436 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60
-- # read -r pci _ _ status
00:09:54.437 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.437 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.437 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:54.437 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]]
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]]
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' ''
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]]
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:09:54.696 17:03:49 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]]
00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1
00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:09:58.890 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:09:58.891 17:03:53 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:09:58.891 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:09:58.891 00:09:58.891 real 0m10.759s 00:09:58.891 user 0m2.728s 00:09:58.891 sys 0m5.173s 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:58.891 17:03:53 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:09:58.891 ************************************ 00:09:58.891 END TEST dm_mount 00:09:58.891 ************************************ 00:09:58.891 17:03:53 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:09:58.891 17:03:53 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:09:58.891 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:09:58.891 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:09:58.891 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:09:58.891 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:09:58.891 17:03:54 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:09:58.891 00:09:58.891 real 0m29.328s 00:09:58.891 user 0m8.467s 00:09:58.891 sys 0m15.962s 00:09:58.891 17:03:54 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:58.891 17:03:54 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:09:58.891 ************************************ 00:09:58.891 END TEST devices 00:09:58.891 ************************************ 00:09:58.891 17:03:54 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:09:58.891 00:09:58.891 real 1m45.561s 00:09:58.891 user 0m32.054s 00:09:58.891 sys 1m0.075s 00:09:58.891 17:03:54 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:58.891 17:03:54 setup.sh -- common/autotest_common.sh@10 -- # set +x 
00:09:58.891 ************************************ 00:09:58.891 END TEST setup.sh 00:09:58.891 ************************************ 00:09:58.891 17:03:54 -- common/autotest_common.sh@1142 -- # return 0 00:09:58.891 17:03:54 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:10:03.082 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:10:03.083 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:10:03.083 Hugepages 00:10:03.083 node hugesize free / total 00:10:03.083 node0 1048576kB 0 / 0 00:10:03.083 node0 2048kB 1024 / 1024 00:10:03.083 node1 1048576kB 0 / 0 00:10:03.083 node1 2048kB 1024 / 1024 00:10:03.083 00:10:03.083 Type BDF Vendor Device NUMA Driver Device Block devices 00:10:03.083 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:10:03.083 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:10:03.083 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:10:03.083 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:10:03.083 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:10:03.083 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:10:03.083 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:10:03.083 17:03:58 -- spdk/autotest.sh@130 -- # uname -s 00:10:03.083 17:03:58 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
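[editor's note] The `setup.sh status` dump above reports hugepages per NUMA node in `free / total` form. A minimal awk sketch that totals the free 2048kB pages from such a capture — the field positions are an assumption based on the layout shown in this log:

```shell
#!/usr/bin/env bash
# Total the free 2048kB hugepages from a captured `setup.sh status` block.
# The sample lines are copied from the log above; the awk field positions
# ($2 = page size, $3 = free count) are assumed from that layout.
status='node0 2048kB 1024 / 1024
node1 2048kB 1024 / 1024'
free_2m=$(awk '$2 == "2048kB" { sum += $3 } END { print sum }' <<< "$status")
echo "$free_2m"
```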
00:10:03.083 17:03:58 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:10:03.083 17:03:58 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:10:06.377 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:10:06.377 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:10:06.377 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:10:06.377 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:10:06.377 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:10:06.377 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:10:06.377 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:10:06.377 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:10:06.637 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:10:09.175 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:10:09.175 17:04:04 -- common/autotest_common.sh@1532 -- # sleep 1 00:10:10.115 17:04:05 -- common/autotest_common.sh@1533 -- # bdfs=() 00:10:10.115 17:04:05 -- common/autotest_common.sh@1533 -- # local bdfs 00:10:10.115 17:04:05 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:10:10.115 17:04:05 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:10:10.115 17:04:05 -- common/autotest_common.sh@1513 -- # bdfs=() 00:10:10.115 17:04:05 -- common/autotest_common.sh@1513 -- # local bdfs 00:10:10.115 17:04:05 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:10:10.115 17:04:05 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:10:10.115 17:04:05 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:10:10.375 17:04:05 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:10:10.375 17:04:05 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:10:10.375 17:04:05 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:10:13.667 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:10:13.667 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:10:13.667 Waiting for block devices as requested 00:10:13.667 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:10:13.927 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:13.927 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:13.927 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:14.186 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:14.186 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:14.186 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:14.446 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:14.446 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:14.446 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:14.705 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:14.705 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:14.705 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:14.965 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:14.965 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:14.965 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:15.224 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:15.224 17:04:10 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:10:15.224 17:04:10 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:10:15.224 17:04:10 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:10:15.224 17:04:10 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:10:15.224 17:04:10 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:10:15.224 17:04:10 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1545 -- # grep oacs 00:10:15.224 17:04:10 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:10:15.224 17:04:10 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:10:15.224 17:04:10 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:10:15.224 17:04:10 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:10:15.224 17:04:10 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:10:15.224 17:04:10 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:10:15.224 17:04:10 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:10:15.224 17:04:10 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:10:15.224 17:04:10 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:10:15.224 17:04:10 -- common/autotest_common.sh@1557 -- # continue 00:10:15.224 17:04:10 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:10:15.224 17:04:10 -- common/autotest_common.sh@728 -- # xtrace_disable 00:10:15.224 17:04:10 -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 17:04:10 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
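[editor's note] The `nvme id-ctrl` parsing traced above (`grep oacs`, `cut -d: -f2`, then `oacs_ns_manage=8`) can be reproduced against a captured line. The bit-mask step below is an assumption about how the script derives the namespace-management capability from OACS; the value `0x3f` is taken from the trace:

```shell
#!/usr/bin/env bash
# Re-create the OACS parsing from the trace with a sample line instead of a
# live controller; the raw value 0x3f comes from the log output above.
id_ctrl_line='oacs      : 0x3f'
oacs=$(cut -d: -f2 <<< "$id_ctrl_line")   # yields ' 0x3f', matching the trace
oacs_ns_manage=$(( oacs & 0x8 ))          # bit 3 = Namespace Management (assumed mask)
echo "$oacs_ns_manage"
```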
00:10:15.224 17:04:10 -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:15.224 17:04:10 -- common/autotest_common.sh@10 -- # set +x 00:10:15.224 17:04:10 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:10:19.420 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:10:19.420 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:10:19.420 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:10:19.420 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:10:21.974 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:10:21.974 17:04:16 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:10:21.974 17:04:16 -- common/autotest_common.sh@728 -- # xtrace_disable 00:10:21.974 17:04:16 -- common/autotest_common.sh@10 -- # set +x 00:10:21.974 17:04:16 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:10:21.974 17:04:16 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:10:21.974 17:04:16 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:10:21.974 17:04:16 -- common/autotest_common.sh@1577 -- # bdfs=() 00:10:21.974 17:04:16 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:10:21.975 17:04:16 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:10:21.975 17:04:16 -- common/autotest_common.sh@1513 -- # bdfs=() 00:10:21.975 17:04:16 -- common/autotest_common.sh@1513 -- # local bdfs 00:10:21.975 17:04:16 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:10:21.975 17:04:16 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:10:21.975 17:04:16 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:10:21.975 17:04:17 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:10:21.975 17:04:17 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:10:21.975 17:04:17 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:10:21.975 17:04:17 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:10:21.975 17:04:17 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:10:21.975 17:04:17 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:10:21.975 17:04:17 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:10:21.975 17:04:17 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:10:21.975 17:04:17 -- common/autotest_common.sh@1593 -- # return 0 00:10:21.975 17:04:17 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:10:21.975 17:04:17 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:10:21.975 17:04:17 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:10:21.975 17:04:17 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:10:21.975 17:04:17 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:10:22.588 Restarting all devices. 
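[editor's note] The `get_nvme_bdfs` helper, traced twice above, pipes `gen_nvme.sh` JSON through jq to collect controller PCI addresses. A self-contained sketch with a hypothetical inline payload, shaped only after the `.config[].params.traddr` query visible in the trace (requires jq):

```shell
#!/usr/bin/env bash
# Stand-in for the get_nvme_bdfs step: extract each controller's traddr from
# the gen_nvme.sh JSON. The inline sample is hypothetical; only the jq path
# is taken from the trace.
sample='{"config":[{"params":{"traddr":"0000:5e:00.0"}}]}'
bdfs=($(jq -r '.config[].params.traddr' <<< "$sample"))
(( ${#bdfs[@]} > 0 )) || { echo 'No NVMe controllers found' >&2; exit 1; }
printf '%s\n' "${bdfs[@]}"
```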
00:10:25.878 lstat() error: No such file or directory 00:10:25.878 QAT Error: No GENERAL section found 00:10:25.878 Failed to configure qat_dev0 00:10:25.878 lstat() error: No such file or directory 00:10:25.878 QAT Error: No GENERAL section found 00:10:25.878 Failed to configure qat_dev1 00:10:25.878 lstat() error: No such file or directory 00:10:25.878 QAT Error: No GENERAL section found 00:10:25.878 Failed to configure qat_dev2 00:10:25.878 enable sriov 00:10:25.878 Checking status of all devices. 00:10:25.878 There is 3 QAT acceleration device(s) in the system: 00:10:25.878 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:10:25.878 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:10:25.878 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:10:26.813 0000:3d:00.0 set to 16 VFs 00:10:27.382 0000:3f:00.0 set to 16 VFs 00:10:28.319 0000:da:00.0 set to 16 VFs 00:10:29.697 Properly configured the qat device with driver uio_pci_generic. 
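[editor's note] `qat_setup.sh` reports each c6xx endpoint "set to 16 VFs". On Linux that is normally done by writing the VF count to the device's `sriov_numvfs` sysfs attribute; a guarded sketch follows — the `DRY_RUN` switch is ours, not part of the script, and the BDF is taken from the log:

```shell
#!/usr/bin/env bash
# Standard Linux SR-IOV knob behind "set to 16 VFs": write the VF count to
# the device's sriov_numvfs attribute. Guarded by DRY_RUN (default on) so
# the sketch is safe to run anywhere.
bdf=0000:3d:00.0
numvfs=16
sysfs=/sys/bus/pci/devices/$bdf/sriov_numvfs
if [ "${DRY_RUN:-1}" -eq 1 ]; then
    echo "would write $numvfs to $sysfs"
else
    echo 0 > "$sysfs"          # the count must be reset to 0 before changing it
    echo "$numvfs" > "$sysfs"
fi
```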
00:10:29.697 17:04:25 -- spdk/autotest.sh@162 -- # timing_enter lib 00:10:29.697 17:04:25 -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:29.697 17:04:25 -- common/autotest_common.sh@10 -- # set +x 00:10:29.697 17:04:25 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:10:29.697 17:04:25 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:10:29.697 17:04:25 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:29.697 17:04:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:29.697 17:04:25 -- common/autotest_common.sh@10 -- # set +x 00:10:29.697 ************************************ 00:10:29.697 START TEST env 00:10:29.697 ************************************ 00:10:29.697 17:04:25 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:10:29.957 * Looking for test storage... 00:10:29.957 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:10:29.957 17:04:25 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:10:29.957 17:04:25 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:29.957 17:04:25 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:29.957 17:04:25 env -- common/autotest_common.sh@10 -- # set +x 00:10:29.957 ************************************ 00:10:29.957 START TEST env_memory 00:10:29.957 ************************************ 00:10:29.957 17:04:25 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:10:29.957 00:10:29.957 00:10:29.957 CUnit - A unit testing framework for C - Version 2.1-3 00:10:29.957 http://cunit.sourceforge.net/ 00:10:29.957 00:10:29.957 00:10:29.957 Suite: memory 00:10:29.957 Test: alloc and free memory map ...[2024-07-23 17:04:25.264873] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:10:29.957 passed 00:10:29.957 Test: mem map translation ...[2024-07-23 17:04:25.294363] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:10:29.957 [2024-07-23 17:04:25.294389] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:10:29.957 [2024-07-23 17:04:25.294445] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:10:29.957 [2024-07-23 17:04:25.294463] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:10:29.957 passed 00:10:29.957 Test: mem map registration ...[2024-07-23 17:04:25.352269] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:10:29.957 [2024-07-23 17:04:25.352293] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:10:29.957 passed 00:10:30.218 Test: mem map adjacent registrations ...passed 00:10:30.218 00:10:30.218 Run Summary: Type Total Ran Passed Failed Inactive 00:10:30.218 suites 1 1 n/a 0 0 00:10:30.218 tests 4 4 4 0 0 00:10:30.218 asserts 152 152 152 0 n/a 00:10:30.218 00:10:30.218 Elapsed time = 0.200 seconds 00:10:30.218 00:10:30.218 real 0m0.215s 00:10:30.218 user 0m0.201s 00:10:30.218 sys 0m0.013s 00:10:30.218 17:04:25 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:10:30.218 17:04:25 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:10:30.218 ************************************ 00:10:30.218 END TEST env_memory 00:10:30.218 ************************************ 00:10:30.218 17:04:25 env -- common/autotest_common.sh@1142 -- # return 0 00:10:30.218 17:04:25 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:10:30.218 17:04:25 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:30.218 17:04:25 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:30.218 17:04:25 env -- common/autotest_common.sh@10 -- # set +x 00:10:30.218 ************************************ 00:10:30.218 START TEST env_vtophys 00:10:30.218 ************************************ 00:10:30.218 17:04:25 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:10:30.218 EAL: lib.eal log level changed from notice to debug 00:10:30.218 EAL: Detected lcore 0 as core 0 on socket 0 00:10:30.218 EAL: Detected lcore 1 as core 1 on socket 0 00:10:30.218 EAL: Detected lcore 2 as core 2 on socket 0 00:10:30.218 EAL: Detected lcore 3 as core 3 on socket 0 00:10:30.218 EAL: Detected lcore 4 as core 4 on socket 0 00:10:30.218 EAL: Detected lcore 5 as core 8 on socket 0 00:10:30.218 EAL: Detected lcore 6 as core 9 on socket 0 00:10:30.218 EAL: Detected lcore 7 as core 10 on socket 0 00:10:30.218 EAL: Detected lcore 8 as core 11 on socket 0 00:10:30.218 EAL: Detected lcore 9 as core 16 on socket 0 00:10:30.218 EAL: Detected lcore 10 as core 17 on socket 0 00:10:30.218 EAL: Detected lcore 11 as core 18 on socket 0 00:10:30.218 EAL: Detected lcore 12 as core 19 on socket 0 00:10:30.218 EAL: Detected lcore 13 as core 20 on socket 0 00:10:30.218 EAL: Detected lcore 14 as core 24 on socket 0 00:10:30.218 EAL: Detected lcore 15 as core 25 on socket 0 00:10:30.218 EAL: Detected lcore 16 as core 26 on socket 0 
00:10:30.218 EAL: Detected lcore 17 as core 27 on socket 0 00:10:30.218 EAL: Detected lcore 18 as core 0 on socket 1 00:10:30.218 EAL: Detected lcore 19 as core 1 on socket 1 00:10:30.218 EAL: Detected lcore 20 as core 2 on socket 1 00:10:30.218 EAL: Detected lcore 21 as core 3 on socket 1 00:10:30.218 EAL: Detected lcore 22 as core 4 on socket 1 00:10:30.218 EAL: Detected lcore 23 as core 8 on socket 1 00:10:30.218 EAL: Detected lcore 24 as core 9 on socket 1 00:10:30.218 EAL: Detected lcore 25 as core 10 on socket 1 00:10:30.218 EAL: Detected lcore 26 as core 11 on socket 1 00:10:30.218 EAL: Detected lcore 27 as core 16 on socket 1 00:10:30.218 EAL: Detected lcore 28 as core 17 on socket 1 00:10:30.218 EAL: Detected lcore 29 as core 18 on socket 1 00:10:30.218 EAL: Detected lcore 30 as core 19 on socket 1 00:10:30.218 EAL: Detected lcore 31 as core 20 on socket 1 00:10:30.218 EAL: Detected lcore 32 as core 24 on socket 1 00:10:30.218 EAL: Detected lcore 33 as core 25 on socket 1 00:10:30.218 EAL: Detected lcore 34 as core 26 on socket 1 00:10:30.218 EAL: Detected lcore 35 as core 27 on socket 1 00:10:30.218 EAL: Detected lcore 36 as core 0 on socket 0 00:10:30.218 EAL: Detected lcore 37 as core 1 on socket 0 00:10:30.218 EAL: Detected lcore 38 as core 2 on socket 0 00:10:30.218 EAL: Detected lcore 39 as core 3 on socket 0 00:10:30.218 EAL: Detected lcore 40 as core 4 on socket 0 00:10:30.218 EAL: Detected lcore 41 as core 8 on socket 0 00:10:30.218 EAL: Detected lcore 42 as core 9 on socket 0 00:10:30.218 EAL: Detected lcore 43 as core 10 on socket 0 00:10:30.218 EAL: Detected lcore 44 as core 11 on socket 0 00:10:30.218 EAL: Detected lcore 45 as core 16 on socket 0 00:10:30.218 EAL: Detected lcore 46 as core 17 on socket 0 00:10:30.218 EAL: Detected lcore 47 as core 18 on socket 0 00:10:30.218 EAL: Detected lcore 48 as core 19 on socket 0 00:10:30.218 EAL: Detected lcore 49 as core 20 on socket 0 00:10:30.218 EAL: Detected lcore 50 as core 24 on socket 0 
00:10:30.218 EAL: Detected lcore 51 as core 25 on socket 0 00:10:30.218 EAL: Detected lcore 52 as core 26 on socket 0 00:10:30.218 EAL: Detected lcore 53 as core 27 on socket 0 00:10:30.218 EAL: Detected lcore 54 as core 0 on socket 1 00:10:30.218 EAL: Detected lcore 55 as core 1 on socket 1 00:10:30.218 EAL: Detected lcore 56 as core 2 on socket 1 00:10:30.218 EAL: Detected lcore 57 as core 3 on socket 1 00:10:30.218 EAL: Detected lcore 58 as core 4 on socket 1 00:10:30.218 EAL: Detected lcore 59 as core 8 on socket 1 00:10:30.218 EAL: Detected lcore 60 as core 9 on socket 1 00:10:30.218 EAL: Detected lcore 61 as core 10 on socket 1 00:10:30.218 EAL: Detected lcore 62 as core 11 on socket 1 00:10:30.218 EAL: Detected lcore 63 as core 16 on socket 1 00:10:30.218 EAL: Detected lcore 64 as core 17 on socket 1 00:10:30.218 EAL: Detected lcore 65 as core 18 on socket 1 00:10:30.218 EAL: Detected lcore 66 as core 19 on socket 1 00:10:30.218 EAL: Detected lcore 67 as core 20 on socket 1 00:10:30.218 EAL: Detected lcore 68 as core 24 on socket 1 00:10:30.218 EAL: Detected lcore 69 as core 25 on socket 1 00:10:30.218 EAL: Detected lcore 70 as core 26 on socket 1 00:10:30.218 EAL: Detected lcore 71 as core 27 on socket 1 00:10:30.218 EAL: Maximum logical cores by configuration: 128 00:10:30.218 EAL: Detected CPU lcores: 72 00:10:30.218 EAL: Detected NUMA nodes: 2 00:10:30.218 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:10:30.218 EAL: Detected shared linkage of DPDK 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_auxiliary.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23.0 00:10:30.218 EAL: open shared lib 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_mlx5.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_qat.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23.0 00:10:30.218 EAL: pmd.net.i40e.init log level changed from disabled to notice 00:10:30.218 EAL: pmd.net.i40e.driver log level changed from disabled to notice 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_ipsec_mb.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_mlx5.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_isal.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_mlx5.so.23.0 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_auxiliary.so 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:10:30.218 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_mlx5.so 00:10:30.219 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_common_qat.so 00:10:30.219 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:10:30.219 EAL: open shared 
lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:10:30.219 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_ipsec_mb.so 00:10:30.219 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_crypto_mlx5.so 00:10:30.219 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_isal.so 00:10:30.219 EAL: open shared lib /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_compress_mlx5.so 00:10:30.219 EAL: No shared files mode enabled, IPC will be disabled 00:10:30.219 EAL: No shared files mode enabled, IPC is disabled 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver 
qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 
0000:da:02.2 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:10:30.219 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:10:30.219 EAL: Bus pci wants IOVA as 'PA' 00:10:30.219 EAL: Bus auxiliary wants IOVA as 'DC' 00:10:30.219 EAL: Bus vdev wants IOVA as 'DC' 00:10:30.219 EAL: Selected IOVA mode 'PA' 00:10:30.219 EAL: Probing VFIO support... 00:10:30.219 EAL: IOMMU type 1 (Type 1) is supported 00:10:30.219 EAL: IOMMU type 7 (sPAPR) is not supported 00:10:30.219 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:10:30.219 EAL: VFIO support initialized 00:10:30.219 EAL: Ask a virtual area of 0x2e000 bytes 00:10:30.219 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:10:30.219 EAL: Setting up physically contiguous memory... 
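The decision visible above ("wants IOVA as 'PA'" from every qat VF, 'DC' from the auxiliary and vdev buses, then "Selected IOVA mode 'PA'") can be modelled with a small shell sketch. The precedence logic below is a deliberate simplification of what DPDK's EAL really does (EAL also weighs kernel driver bindings and IOMMU capability), and the function name `pick_iova_mode` is made up for illustration:

```shell
#!/bin/sh
# Toy model of the IOVA mode vote seen in the log above: each device or bus
# reports 'PA', 'VA', or 'DC' (don't care). Anything that insists on physical
# addresses forces 'PA' for the whole process.
pick_iova_mode() {
    mode=DC
    for want in "$@"; do
        case $want in
            PA) mode=PA ;;
            VA) [ "$mode" = DC ] && mode=VA ;;
        esac
    done
    echo "$mode"
}

# The qat VFs want PA, the auxiliary/vdev buses say DC -> 'PA' wins, as logged.
pick_iova_mode PA PA DC DC    # prints PA
```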
00:10:30.219 EAL: Setting maximum number of open files to 524288
00:10:30.219 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152
00:10:30.219 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152
00:10:30.219 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x20000002e000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 0, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x200000200000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x200000200000, size 400000000
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x200400200000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 0, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x200400400000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x200400400000, size 400000000
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x200800400000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 0, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x200800600000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x200800600000, size 400000000
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x200c00600000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 0, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000
00:10:30.219 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x201000800000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x201400a00000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x201800c00000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000
00:10:30.219 EAL: Ask a virtual area of 0x61000 bytes
00:10:30.219 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000)
00:10:30.219 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:10:30.219 EAL: Ask a virtual area of 0x400000000 bytes
00:10:30.219 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000)
00:10:30.219 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000
00:10:30.219 EAL: Hugepages will be freed exactly as allocated.
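The reservations above are internally consistent: one memseg list holds n_segs:8192 segments of hugepage_sz:2097152 (2 MiB), which is exactly the 0x400000000 bytes EAL asks for per list, and four such lists are created per socket. A quick shell check of that arithmetic, using only numbers taken from the log:

```shell
#!/bin/sh
# Numbers reported by EAL in the log above.
n_segs=8192
hugepage_sz=2097152       # 2 MiB
lists_per_socket=4

list_bytes=$((n_segs * hugepage_sz))
printf 'one memseg list: 0x%x bytes\n' "$list_bytes"      # 0x400000000, matching the log
per_socket_gib=$((lists_per_socket * list_bytes / 1024 / 1024 / 1024))
echo "VA reserved per socket: ${per_socket_gib} GiB"      # 64 GiB across 4 lists
```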
00:10:30.219 EAL: No shared files mode enabled, IPC is disabled 00:10:30.219 EAL: No shared files mode enabled, IPC is disabled 00:10:30.219 EAL: TSC frequency is ~2300000 KHz 00:10:30.219 EAL: Main lcore 0 is ready (tid=7f0a14e13b00;cpuset=[0]) 00:10:30.219 EAL: Trying to obtain current memory policy. 00:10:30.219 EAL: Setting policy MPOL_PREFERRED for socket 0 00:10:30.219 EAL: Restoring previous memory policy: 0 00:10:30.219 EAL: request: mp_malloc_sync 00:10:30.219 EAL: No shared files mode enabled, IPC is disabled 00:10:30.219 EAL: Heap on socket 0 was expanded by 2MB 00:10:30.219 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:10:30.219 EAL: probe driver: 8086:37c9 qat 00:10:30.219 EAL: PCI memory mapped at 0x202001000000 00:10:30.219 EAL: PCI memory mapped at 0x202001001000 00:10:30.219 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:10:30.219 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:10:30.219 EAL: probe driver: 8086:37c9 qat 00:10:30.219 EAL: PCI memory mapped at 0x202001002000 00:10:30.219 EAL: PCI memory mapped at 0x202001003000 00:10:30.219 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:10:30.219 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:10:30.219 EAL: probe driver: 8086:37c9 qat 00:10:30.219 EAL: PCI memory mapped at 0x202001004000 00:10:30.219 EAL: PCI memory mapped at 0x202001005000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001006000 00:10:30.220 EAL: PCI memory mapped at 0x202001007000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001008000 00:10:30.220 EAL: PCI memory mapped at 0x202001009000 00:10:30.220 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200100a000 00:10:30.220 EAL: PCI memory mapped at 0x20200100b000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200100c000 00:10:30.220 EAL: PCI memory mapped at 0x20200100d000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200100e000 00:10:30.220 EAL: PCI memory mapped at 0x20200100f000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001010000 00:10:30.220 EAL: PCI memory mapped at 0x202001011000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001012000 00:10:30.220 EAL: PCI memory mapped at 0x202001013000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001014000 00:10:30.220 EAL: PCI memory mapped at 0x202001015000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 
0x202001016000 00:10:30.220 EAL: PCI memory mapped at 0x202001017000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001018000 00:10:30.220 EAL: PCI memory mapped at 0x202001019000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200101a000 00:10:30.220 EAL: PCI memory mapped at 0x20200101b000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200101c000 00:10:30.220 EAL: PCI memory mapped at 0x20200101d000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:10:30.220 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200101e000 00:10:30.220 EAL: PCI memory mapped at 0x20200101f000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001020000 00:10:30.220 EAL: PCI memory mapped at 0x202001021000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001022000 00:10:30.220 EAL: PCI memory mapped at 0x202001023000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001024000 00:10:30.220 EAL: PCI memory mapped at 0x202001025000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001026000 00:10:30.220 EAL: PCI memory mapped at 0x202001027000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001028000 00:10:30.220 EAL: PCI memory mapped at 0x202001029000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200102a000 00:10:30.220 EAL: PCI memory mapped at 0x20200102b000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200102c000 00:10:30.220 EAL: PCI memory mapped at 0x20200102d000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200102e000 00:10:30.220 EAL: PCI memory mapped at 0x20200102f000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001030000 00:10:30.220 EAL: PCI memory mapped at 0x202001031000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:02.0 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001032000 00:10:30.220 EAL: PCI memory mapped at 0x202001033000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001034000 00:10:30.220 EAL: PCI memory mapped at 0x202001035000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001036000 00:10:30.220 EAL: PCI memory mapped at 0x202001037000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001038000 00:10:30.220 EAL: PCI memory mapped at 0x202001039000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200103a000 00:10:30.220 EAL: PCI memory mapped at 0x20200103b000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200103c000 00:10:30.220 EAL: PCI memory mapped at 0x20200103d000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:10:30.220 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x20200103e000 00:10:30.220 EAL: PCI memory 
mapped at 0x20200103f000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:10:30.220 EAL: PCI device 0000:41:00.0 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37d2 net_i40e 00:10:30.220 EAL: Not managed by a supported kernel driver, skipped 00:10:30.220 EAL: PCI device 0000:41:00.1 on NUMA socket 0 00:10:30.220 EAL: probe driver: 8086:37d2 net_i40e 00:10:30.220 EAL: Not managed by a supported kernel driver, skipped 00:10:30.220 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001040000 00:10:30.220 EAL: PCI memory mapped at 0x202001041000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:10:30.220 EAL: Trying to obtain current memory policy. 00:10:30.220 EAL: Setting policy MPOL_PREFERRED for socket 1 00:10:30.220 EAL: Restoring previous memory policy: 4 00:10:30.220 EAL: request: mp_malloc_sync 00:10:30.220 EAL: No shared files mode enabled, IPC is disabled 00:10:30.220 EAL: Heap on socket 1 was expanded by 2MB 00:10:30.220 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.220 EAL: PCI memory mapped at 0x202001042000 00:10:30.220 EAL: PCI memory mapped at 0x202001043000 00:10:30.220 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:10:30.220 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:10:30.220 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x202001044000 00:10:30.221 EAL: PCI memory mapped at 0x202001045000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x202001046000 00:10:30.221 EAL: PCI memory mapped at 0x202001047000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:10:30.221 
EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x202001048000 00:10:30.221 EAL: PCI memory mapped at 0x202001049000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x20200104a000 00:10:30.221 EAL: PCI memory mapped at 0x20200104b000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x20200104c000 00:10:30.221 EAL: PCI memory mapped at 0x20200104d000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x20200104e000 00:10:30.221 EAL: PCI memory mapped at 0x20200104f000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x202001050000 00:10:30.221 EAL: PCI memory mapped at 0x202001051000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x202001052000 00:10:30.221 EAL: PCI memory mapped at 0x202001053000 00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:10:30.221 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:10:30.221 EAL: probe driver: 8086:37c9 qat 00:10:30.221 EAL: PCI memory mapped at 0x202001054000 00:10:30.221 EAL: PCI memory mapped at 0x202001055000 00:10:30.221 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:10:30.221 EAL: PCI device 0000:da:02.3 on NUMA socket 1
00:10:30.221 EAL: probe driver: 8086:37c9 qat
00:10:30.221 EAL: PCI memory mapped at 0x202001056000
00:10:30.221 EAL: PCI memory mapped at 0x202001057000
00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:10:30.221 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:10:30.221 EAL: probe driver: 8086:37c9 qat
00:10:30.221 EAL: PCI memory mapped at 0x202001058000
00:10:30.221 EAL: PCI memory mapped at 0x202001059000
00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:10:30.221 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:10:30.221 EAL: probe driver: 8086:37c9 qat
00:10:30.221 EAL: PCI memory mapped at 0x20200105a000
00:10:30.221 EAL: PCI memory mapped at 0x20200105b000
00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:10:30.221 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:10:30.221 EAL: probe driver: 8086:37c9 qat
00:10:30.221 EAL: PCI memory mapped at 0x20200105c000
00:10:30.221 EAL: PCI memory mapped at 0x20200105d000
00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:10:30.221 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:10:30.221 EAL: probe driver: 8086:37c9 qat
00:10:30.221 EAL: PCI memory mapped at 0x20200105e000
00:10:30.221 EAL: PCI memory mapped at 0x20200105f000
00:10:30.221 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:10:30.221 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: No PCI address specified using 'addr=' in: bus=pci
00:10:30.481 EAL: Mem event callback 'spdk:(nil)' registered
00:10:30.481
00:10:30.481
00:10:30.481      CUnit - A unit testing framework for C - Version 2.1-3
00:10:30.481      http://cunit.sourceforge.net/
00:10:30.481
00:10:30.481
00:10:30.481 Suite: components_suite
00:10:30.481   Test: vtophys_malloc_test ...passed
00:10:30.481   Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 4MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 4MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 6MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 6MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 10MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 10MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 18MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 18MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 34MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 34MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 66MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 66MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 130MB
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was shrunk by 130MB
00:10:30.481 EAL: Trying to obtain current memory policy.
00:10:30.481 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.481 EAL: Restoring previous memory policy: 4
00:10:30.481 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.481 EAL: request: mp_malloc_sync
00:10:30.481 EAL: No shared files mode enabled, IPC is disabled
00:10:30.481 EAL: Heap on socket 0 was expanded by 258MB
00:10:30.740 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.740 EAL: request: mp_malloc_sync
00:10:30.740 EAL: No shared files mode enabled, IPC is disabled
00:10:30.740 EAL: Heap on socket 0 was shrunk by 258MB
00:10:30.740 EAL: Trying to obtain current memory policy.
00:10:30.740 EAL: Setting policy MPOL_PREFERRED for socket 0
00:10:30.740 EAL: Restoring previous memory policy: 4
00:10:30.740 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.740 EAL: request: mp_malloc_sync
00:10:30.740 EAL: No shared files mode enabled, IPC is disabled
00:10:30.740 EAL: Heap on socket 0 was expanded by 514MB
00:10:30.999 EAL: Calling mem event callback 'spdk:(nil)'
00:10:30.999 EAL: request: mp_malloc_sync
00:10:30.999 EAL: No shared files mode enabled, IPC is disabled
00:10:30.999 EAL: Heap on socket 0 was shrunk by 514MB
00:10:30.999 EAL: Trying to obtain current memory policy.
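The expand/shrink rounds of vtophys_spdk_malloc_test step through 4, 6, 10, 18, 34, 66, 130, 258, 514 and finally 1026 MB: each size is 2^k + 2 MB. The formula is only an observation about the logged sizes (the actual values come from SPDK's test source); a one-liner to regenerate the sequence:

```shell
#!/bin/sh
# Regenerate the heap sizes the vtophys test walks through: 2^k + 2 MB, k = 1..10.
k=1
while [ "$k" -le 10 ]; do
    printf '%d MB\n' $(( (1 << k) + 2 ))
    k=$((k + 1))
done
# prints 4, 6, 10, 18, 34, 66, 130, 258, 514, 1026 MB - the sizes in the log
```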
00:10:30.999 EAL: Setting policy MPOL_PREFERRED for socket 0 00:10:31.258 EAL: Restoring previous memory policy: 4 00:10:31.258 EAL: Calling mem event callback 'spdk:(nil)' 00:10:31.258 EAL: request: mp_malloc_sync 00:10:31.258 EAL: No shared files mode enabled, IPC is disabled 00:10:31.258 EAL: Heap on socket 0 was expanded by 1026MB 00:10:31.517 EAL: Calling mem event callback 'spdk:(nil)' 00:10:31.517 EAL: request: mp_malloc_sync 00:10:31.517 EAL: No shared files mode enabled, IPC is disabled 00:10:31.517 EAL: Heap on socket 0 was shrunk by 1026MB 00:10:31.517 passed 00:10:31.517 00:10:31.517 Run Summary: Type Total Ran Passed Failed Inactive 00:10:31.517 suites 1 1 n/a 0 0 00:10:31.518 tests 2 2 2 0 0 00:10:31.518 asserts 5946 5946 5946 0 n/a 00:10:31.518 00:10:31.518 Elapsed time = 1.181 seconds 00:10:31.518 EAL: No shared files mode enabled, IPC is disabled 00:10:31.518 EAL: No shared files mode enabled, IPC is disabled 00:10:31.518 EAL: No shared files mode enabled, IPC is disabled 00:10:31.518 00:10:31.518 real 0m1.382s 00:10:31.518 user 0m0.772s 00:10:31.518 sys 0m0.577s 00:10:31.518 17:04:26 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:31.518 17:04:26 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:10:31.518 ************************************ 00:10:31.518 END TEST env_vtophys 00:10:31.518 ************************************ 00:10:31.777 17:04:26 env -- common/autotest_common.sh@1142 -- # return 0 00:10:31.777 17:04:26 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:10:31.777 17:04:26 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:31.777 17:04:26 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:31.777 17:04:26 env -- common/autotest_common.sh@10 -- # set +x 00:10:31.777 ************************************ 00:10:31.777 START TEST env_pci 00:10:31.777 ************************************ 00:10:31.777 17:04:26 
env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:10:31.777
00:10:31.777
00:10:31.777 CUnit - A unit testing framework for C - Version 2.1-3
00:10:31.777 http://cunit.sourceforge.net/
00:10:31.777
00:10:31.777
00:10:31.777 Suite: pci
00:10:31.777 Test: pci_hook ...[2024-07-23 17:04:27.013317] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 4066678 has claimed it
00:10:31.777 EAL: Cannot find device (10000:00:01.0)
00:10:31.777 EAL: Failed to attach device on primary process
00:10:31.777 passed
00:10:31.777
00:10:31.777 Run Summary: Type Total Ran Passed Failed Inactive
00:10:31.777 suites 1 1 n/a 0 0
00:10:31.777 tests 1 1 1 0 0
00:10:31.777 asserts 25 25 25 0 n/a
00:10:31.777
00:10:31.777 Elapsed time = 0.042 seconds
00:10:31.777
00:10:31.777 real 0m0.069s
00:10:31.777 user 0m0.018s
00:10:31.777 sys 0m0.050s
00:10:31.777 17:04:27 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:31.777 17:04:27 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:10:31.777 ************************************
00:10:31.777 END TEST env_pci
00:10:31.777 ************************************
00:10:31.777 17:04:27 env -- common/autotest_common.sh@1142 -- # return 0
00:10:31.777 17:04:27 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:10:31.777 17:04:27 env -- env/env.sh@15 -- # uname
00:10:31.777 17:04:27 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:10:31.777 17:04:27 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:10:31.777 17:04:27 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:10:31.777 17:04:27 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:10:31.777 17:04:27 env --
common/autotest_common.sh@1105 -- # xtrace_disable 00:10:31.777 17:04:27 env -- common/autotest_common.sh@10 -- # set +x 00:10:31.777 ************************************ 00:10:31.777 START TEST env_dpdk_post_init 00:10:31.777 ************************************ 00:10:31.777 17:04:27 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:10:31.777 EAL: Detected CPU lcores: 72 00:10:31.777 EAL: Detected NUMA nodes: 2 00:10:31.777 EAL: Detected shared linkage of DPDK 00:10:31.777 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:10:32.037 EAL: Selected IOVA mode 'PA' 00:10:32.037 EAL: VFIO support initialized 00:10:32.037 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.037 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.037 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:10:32.037 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.037 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.037 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.037 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:10:32.037 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.037 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 
00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:10:32.038 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue 
pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 
0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:10:32.038 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.038 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3f:02.6 (socket 0) 00:10:32.038 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: 
Creating cryptodev 0000:da:01.2_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 
00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 
00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:10:32.039 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:10:32.039 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:10:32.039 TELEMETRY: No legacy callbacks, legacy socket not created 00:10:32.039 EAL: Using IOMMU type 1 (Type 1) 00:10:32.039 EAL: 
Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:10:32.039 EAL: Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:10:32.039 EAL: Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:10:32.039 EAL: Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:10:32.039 EAL: Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:10:32.039 EAL: Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:10:32.039 EAL: Ignore mapping IO port bar(1) 00:10:32.039 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:10:32.298 EAL: Ignore mapping IO port bar(1) 00:10:32.298 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:10:32.298 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:10:32.298 EAL: Ignore mapping IO port bar(1) 00:10:32.298 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:10:32.298 EAL: Ignore mapping IO port bar(1) 00:10:32.298 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 
(socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Ignore mapping IO port bar(5) 00:10:32.556 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:10:32.556 EAL: Ignore mapping IO port bar(1) 00:10:32.556 EAL: Ignore mapping IO port bar(5) 00:10:32.556 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:10:35.086 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:10:35.086 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:10:35.086 Starting DPDK initialization... 00:10:35.086 Starting SPDK post initialization... 00:10:35.086 SPDK NVMe probe 00:10:35.086 Attaching to 0000:5e:00.0 00:10:35.086 Attached to 0000:5e:00.0 00:10:35.086 Cleaning up... 
00:10:35.086
00:10:35.086 real 0m3.284s
00:10:35.086 user 0m2.230s
00:10:35.086 sys 0m0.615s
00:10:35.086 17:04:30 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:35.086 17:04:30 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x
00:10:35.086 ************************************
00:10:35.086 END TEST env_dpdk_post_init
00:10:35.086 ************************************
00:10:35.086 17:04:30 env -- common/autotest_common.sh@1142 -- # return 0
00:10:35.086 17:04:30 env -- env/env.sh@26 -- # uname
00:10:35.086 17:04:30 env -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:10:35.086 17:04:30 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:10:35.086 17:04:30 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:10:35.086 17:04:30 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:35.086 17:04:30 env -- common/autotest_common.sh@10 -- # set +x
00:10:35.346 ************************************
00:10:35.346 START TEST env_mem_callbacks
00:10:35.346 ************************************
00:10:35.346 17:04:30 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:10:35.346 EAL: Detected CPU lcores: 72
00:10:35.346 EAL: Detected NUMA nodes: 2
00:10:35.346 EAL: Detected shared linkage of DPDK
00:10:35.346 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:10:35.346 EAL: Selected IOVA mode 'PA'
00:10:35.346 EAL: VFIO support initialized
00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:10:35.346 CRYPTODEV: Initialisation parameters - name:
0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:10:35.346 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue 
pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:10:35.346 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.346 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:10:35.346 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 
0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3f:01.3 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: 
Creating cryptodev 0000:3f:01.7_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 
00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 
00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 
0000:da:01.5_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.347 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:10:35.347 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.347 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters 
- name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:10:35.348 CRYPTODEV: 
Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:10:35.348 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:10:35.348 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:10:35.348 TELEMETRY: No legacy callbacks, legacy socket not created 00:10:35.348 00:10:35.348 00:10:35.348 CUnit - A unit testing framework for C - Version 2.1-3 00:10:35.348 http://cunit.sourceforge.net/ 00:10:35.348 00:10:35.348 00:10:35.348 Suite: memory 00:10:35.348 Test: test ... 00:10:35.348 register 0x200000200000 2097152 00:10:35.348 register 0x201000a00000 2097152 00:10:35.348 malloc 3145728 00:10:35.348 register 0x200000400000 4194304 00:10:35.348 buf 0x200000500000 len 3145728 PASSED 00:10:35.348 malloc 64 00:10:35.348 buf 0x2000004fff40 len 64 PASSED 00:10:35.348 malloc 4194304 00:10:35.348 register 0x200000800000 6291456 00:10:35.348 buf 0x200000a00000 len 4194304 PASSED 00:10:35.348 free 0x200000500000 3145728 00:10:35.348 free 0x2000004fff40 64 00:10:35.348 unregister 0x200000400000 4194304 PASSED 00:10:35.348 free 0x200000a00000 4194304 00:10:35.348 unregister 0x200000800000 6291456 PASSED 00:10:35.348 malloc 8388608 00:10:35.348 register 0x200000400000 10485760 00:10:35.348 buf 0x200000600000 len 8388608 PASSED 00:10:35.348 free 0x200000600000 8388608 00:10:35.348 unregister 0x200000400000 10485760 PASSED 00:10:35.348 passed 00:10:35.348 00:10:35.348 Run Summary: Type Total Ran Passed Failed Inactive 00:10:35.348 suites 1 1 n/a 0 0 00:10:35.348 tests 1 1 
1 0 0 00:10:35.348 asserts 16 16 16 0 n/a 00:10:35.348 00:10:35.348 Elapsed time = 0.007 seconds 00:10:35.348 00:10:35.348 real 0m0.109s 00:10:35.348 user 0m0.032s 00:10:35.348 sys 0m0.076s 00:10:35.348 17:04:30 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:35.348 17:04:30 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:10:35.348 ************************************ 00:10:35.348 END TEST env_mem_callbacks 00:10:35.348 ************************************ 00:10:35.348 17:04:30 env -- common/autotest_common.sh@1142 -- # return 0 00:10:35.348 00:10:35.348 real 0m5.616s 00:10:35.348 user 0m3.457s 00:10:35.348 sys 0m1.727s 00:10:35.348 17:04:30 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:35.348 17:04:30 env -- common/autotest_common.sh@10 -- # set +x 00:10:35.348 ************************************ 00:10:35.348 END TEST env 00:10:35.348 ************************************ 00:10:35.348 17:04:30 -- common/autotest_common.sh@1142 -- # return 0 00:10:35.348 17:04:30 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:10:35.348 17:04:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:35.348 17:04:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:35.348 17:04:30 -- common/autotest_common.sh@10 -- # set +x 00:10:35.348 ************************************ 00:10:35.348 START TEST rpc 00:10:35.348 ************************************ 00:10:35.348 17:04:30 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:10:35.607 * Looking for test storage... 
00:10:35.607 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:10:35.607 17:04:30 rpc -- rpc/rpc.sh@65 -- # spdk_pid=4067328 00:10:35.607 17:04:30 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:10:35.607 17:04:30 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:10:35.607 17:04:30 rpc -- rpc/rpc.sh@67 -- # waitforlisten 4067328 00:10:35.607 17:04:30 rpc -- common/autotest_common.sh@829 -- # '[' -z 4067328 ']' 00:10:35.607 17:04:30 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:35.607 17:04:30 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:35.607 17:04:30 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:35.607 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:35.607 17:04:30 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:35.607 17:04:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:10:35.607 [2024-07-23 17:04:30.940071] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:10:35.607 [2024-07-23 17:04:30.940148] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4067328 ] 00:10:35.865 [2024-07-23 17:04:31.073163] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:35.865 [2024-07-23 17:04:31.128210] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:10:35.865 [2024-07-23 17:04:31.128258] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 4067328' to capture a snapshot of events at runtime. 
00:10:35.865 [2024-07-23 17:04:31.128273] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:10:35.865 [2024-07-23 17:04:31.128286] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:10:35.865 [2024-07-23 17:04:31.128296] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid4067328 for offline analysis/debug. 00:10:35.865 [2024-07-23 17:04:31.128333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:36.801 17:04:31 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:36.801 17:04:31 rpc -- common/autotest_common.sh@862 -- # return 0 00:10:36.801 17:04:31 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:10:36.801 17:04:31 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:10:36.801 17:04:31 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:10:36.801 17:04:31 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:10:36.801 17:04:31 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:36.801 17:04:31 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:36.801 17:04:31 rpc -- common/autotest_common.sh@10 -- # set +x 00:10:36.801 ************************************ 00:10:36.801 START TEST rpc_integrity 00:10:36.801 ************************************ 00:10:36.801 17:04:31 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:10:36.801 17:04:31 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:36.801 17:04:31 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.801 17:04:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.801 17:04:31 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.801 17:04:31 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:10:36.801 17:04:31 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:10:36.801 { 00:10:36.801 "name": "Malloc0", 00:10:36.801 "aliases": [ 00:10:36.801 "b5343344-26f7-4ed0-8e74-d9c262995810" 00:10:36.801 ], 00:10:36.801 "product_name": "Malloc disk", 00:10:36.801 "block_size": 512, 00:10:36.801 "num_blocks": 16384, 00:10:36.801 "uuid": "b5343344-26f7-4ed0-8e74-d9c262995810", 00:10:36.801 "assigned_rate_limits": { 00:10:36.801 "rw_ios_per_sec": 0, 00:10:36.801 "rw_mbytes_per_sec": 0, 00:10:36.801 "r_mbytes_per_sec": 0, 00:10:36.801 "w_mbytes_per_sec": 0 00:10:36.801 }, 00:10:36.801 "claimed": false, 00:10:36.801 
"zoned": false, 00:10:36.801 "supported_io_types": { 00:10:36.801 "read": true, 00:10:36.801 "write": true, 00:10:36.801 "unmap": true, 00:10:36.801 "flush": true, 00:10:36.801 "reset": true, 00:10:36.801 "nvme_admin": false, 00:10:36.801 "nvme_io": false, 00:10:36.801 "nvme_io_md": false, 00:10:36.801 "write_zeroes": true, 00:10:36.801 "zcopy": true, 00:10:36.801 "get_zone_info": false, 00:10:36.801 "zone_management": false, 00:10:36.801 "zone_append": false, 00:10:36.801 "compare": false, 00:10:36.801 "compare_and_write": false, 00:10:36.801 "abort": true, 00:10:36.801 "seek_hole": false, 00:10:36.801 "seek_data": false, 00:10:36.801 "copy": true, 00:10:36.801 "nvme_iov_md": false 00:10:36.801 }, 00:10:36.801 "memory_domains": [ 00:10:36.801 { 00:10:36.801 "dma_device_id": "system", 00:10:36.801 "dma_device_type": 1 00:10:36.801 }, 00:10:36.801 { 00:10:36.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.801 "dma_device_type": 2 00:10:36.801 } 00:10:36.801 ], 00:10:36.801 "driver_specific": {} 00:10:36.801 } 00:10:36.801 ]' 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.801 [2024-07-23 17:04:32.101267] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:10:36.801 [2024-07-23 17:04:32.101306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:36.801 [2024-07-23 17:04:32.101326] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2006870 00:10:36.801 [2024-07-23 17:04:32.101338] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:36.801 [2024-07-23 
17:04:32.102848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:36.801 [2024-07-23 17:04:32.102875] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:10:36.801 Passthru0 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.801 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.801 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:10:36.801 { 00:10:36.801 "name": "Malloc0", 00:10:36.801 "aliases": [ 00:10:36.801 "b5343344-26f7-4ed0-8e74-d9c262995810" 00:10:36.801 ], 00:10:36.801 "product_name": "Malloc disk", 00:10:36.801 "block_size": 512, 00:10:36.801 "num_blocks": 16384, 00:10:36.801 "uuid": "b5343344-26f7-4ed0-8e74-d9c262995810", 00:10:36.801 "assigned_rate_limits": { 00:10:36.801 "rw_ios_per_sec": 0, 00:10:36.801 "rw_mbytes_per_sec": 0, 00:10:36.801 "r_mbytes_per_sec": 0, 00:10:36.802 "w_mbytes_per_sec": 0 00:10:36.802 }, 00:10:36.802 "claimed": true, 00:10:36.802 "claim_type": "exclusive_write", 00:10:36.802 "zoned": false, 00:10:36.802 "supported_io_types": { 00:10:36.802 "read": true, 00:10:36.802 "write": true, 00:10:36.802 "unmap": true, 00:10:36.802 "flush": true, 00:10:36.802 "reset": true, 00:10:36.802 "nvme_admin": false, 00:10:36.802 "nvme_io": false, 00:10:36.802 "nvme_io_md": false, 00:10:36.802 "write_zeroes": true, 00:10:36.802 "zcopy": true, 00:10:36.802 "get_zone_info": false, 00:10:36.802 "zone_management": false, 00:10:36.802 "zone_append": false, 00:10:36.802 "compare": false, 00:10:36.802 "compare_and_write": false, 00:10:36.802 "abort": true, 00:10:36.802 "seek_hole": false, 00:10:36.802 "seek_data": false, 
00:10:36.802 "copy": true, 00:10:36.802 "nvme_iov_md": false 00:10:36.802 }, 00:10:36.802 "memory_domains": [ 00:10:36.802 { 00:10:36.802 "dma_device_id": "system", 00:10:36.802 "dma_device_type": 1 00:10:36.802 }, 00:10:36.802 { 00:10:36.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.802 "dma_device_type": 2 00:10:36.802 } 00:10:36.802 ], 00:10:36.802 "driver_specific": {} 00:10:36.802 }, 00:10:36.802 { 00:10:36.802 "name": "Passthru0", 00:10:36.802 "aliases": [ 00:10:36.802 "7f3ab17d-5e9a-5eef-b4fa-dd3caaa126e3" 00:10:36.802 ], 00:10:36.802 "product_name": "passthru", 00:10:36.802 "block_size": 512, 00:10:36.802 "num_blocks": 16384, 00:10:36.802 "uuid": "7f3ab17d-5e9a-5eef-b4fa-dd3caaa126e3", 00:10:36.802 "assigned_rate_limits": { 00:10:36.802 "rw_ios_per_sec": 0, 00:10:36.802 "rw_mbytes_per_sec": 0, 00:10:36.802 "r_mbytes_per_sec": 0, 00:10:36.802 "w_mbytes_per_sec": 0 00:10:36.802 }, 00:10:36.802 "claimed": false, 00:10:36.802 "zoned": false, 00:10:36.802 "supported_io_types": { 00:10:36.802 "read": true, 00:10:36.802 "write": true, 00:10:36.802 "unmap": true, 00:10:36.802 "flush": true, 00:10:36.802 "reset": true, 00:10:36.802 "nvme_admin": false, 00:10:36.802 "nvme_io": false, 00:10:36.802 "nvme_io_md": false, 00:10:36.802 "write_zeroes": true, 00:10:36.802 "zcopy": true, 00:10:36.802 "get_zone_info": false, 00:10:36.802 "zone_management": false, 00:10:36.802 "zone_append": false, 00:10:36.802 "compare": false, 00:10:36.802 "compare_and_write": false, 00:10:36.802 "abort": true, 00:10:36.802 "seek_hole": false, 00:10:36.802 "seek_data": false, 00:10:36.802 "copy": true, 00:10:36.802 "nvme_iov_md": false 00:10:36.802 }, 00:10:36.802 "memory_domains": [ 00:10:36.802 { 00:10:36.802 "dma_device_id": "system", 00:10:36.802 "dma_device_type": 1 00:10:36.802 }, 00:10:36.802 { 00:10:36.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.802 "dma_device_type": 2 00:10:36.802 } 00:10:36.802 ], 00:10:36.802 "driver_specific": { 00:10:36.802 "passthru": { 
00:10:36.802 "name": "Passthru0", 00:10:36.802 "base_bdev_name": "Malloc0" 00:10:36.802 } 00:10:36.802 } 00:10:36.802 } 00:10:36.802 ]' 00:10:36.802 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:10:36.802 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:10:36.802 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.802 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:36.802 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:36.802 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.061 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:10:37.061 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:10:37.061 17:04:32 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:10:37.061 00:10:37.061 real 0m0.321s 00:10:37.061 user 0m0.195s 00:10:37.061 sys 0m0.058s 00:10:37.061 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.061 17:04:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 ************************************ 00:10:37.061 END TEST rpc_integrity 00:10:37.061 
************************************ 00:10:37.061 17:04:32 rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:37.061 17:04:32 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:10:37.061 17:04:32 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:37.061 17:04:32 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.061 17:04:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 ************************************ 00:10:37.061 START TEST rpc_plugins 00:10:37.061 ************************************ 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:10:37.061 { 00:10:37.061 "name": "Malloc1", 00:10:37.061 "aliases": [ 00:10:37.061 "b6451db2-164c-4905-978c-f95e9ef045ba" 00:10:37.061 ], 00:10:37.061 "product_name": "Malloc disk", 00:10:37.061 "block_size": 4096, 00:10:37.061 "num_blocks": 256, 00:10:37.061 "uuid": "b6451db2-164c-4905-978c-f95e9ef045ba", 00:10:37.061 "assigned_rate_limits": { 00:10:37.061 "rw_ios_per_sec": 0, 00:10:37.061 "rw_mbytes_per_sec": 0, 00:10:37.061 "r_mbytes_per_sec": 0, 00:10:37.061 "w_mbytes_per_sec": 0 
00:10:37.061 }, 00:10:37.061 "claimed": false, 00:10:37.061 "zoned": false, 00:10:37.061 "supported_io_types": { 00:10:37.061 "read": true, 00:10:37.061 "write": true, 00:10:37.061 "unmap": true, 00:10:37.061 "flush": true, 00:10:37.061 "reset": true, 00:10:37.061 "nvme_admin": false, 00:10:37.061 "nvme_io": false, 00:10:37.061 "nvme_io_md": false, 00:10:37.061 "write_zeroes": true, 00:10:37.061 "zcopy": true, 00:10:37.061 "get_zone_info": false, 00:10:37.061 "zone_management": false, 00:10:37.061 "zone_append": false, 00:10:37.061 "compare": false, 00:10:37.061 "compare_and_write": false, 00:10:37.061 "abort": true, 00:10:37.061 "seek_hole": false, 00:10:37.061 "seek_data": false, 00:10:37.061 "copy": true, 00:10:37.061 "nvme_iov_md": false 00:10:37.061 }, 00:10:37.061 "memory_domains": [ 00:10:37.061 { 00:10:37.061 "dma_device_id": "system", 00:10:37.061 "dma_device_type": 1 00:10:37.061 }, 00:10:37.061 { 00:10:37.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.061 "dma_device_type": 2 00:10:37.061 } 00:10:37.061 ], 00:10:37.061 "driver_specific": {} 00:10:37.061 } 00:10:37.061 ]' 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:10:37.061 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.061 17:04:32 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:10:37.061 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:10:37.320 17:04:32 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:10:37.320 00:10:37.320 real 0m0.152s 00:10:37.320 user 0m0.094s 00:10:37.320 sys 0m0.028s 00:10:37.320 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.320 17:04:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:10:37.320 ************************************ 00:10:37.320 END TEST rpc_plugins 00:10:37.320 ************************************ 00:10:37.320 17:04:32 rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:37.320 17:04:32 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:10:37.320 17:04:32 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:37.320 17:04:32 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.320 17:04:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:10:37.320 ************************************ 00:10:37.320 START TEST rpc_trace_cmd_test 00:10:37.320 ************************************ 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.320 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:10:37.320 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid4067328", 00:10:37.320 "tpoint_group_mask": "0x8", 00:10:37.321 "iscsi_conn": { 00:10:37.321 "mask": "0x2", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 
"scsi": { 00:10:37.321 "mask": "0x4", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "bdev": { 00:10:37.321 "mask": "0x8", 00:10:37.321 "tpoint_mask": "0xffffffffffffffff" 00:10:37.321 }, 00:10:37.321 "nvmf_rdma": { 00:10:37.321 "mask": "0x10", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "nvmf_tcp": { 00:10:37.321 "mask": "0x20", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "ftl": { 00:10:37.321 "mask": "0x40", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "blobfs": { 00:10:37.321 "mask": "0x80", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "dsa": { 00:10:37.321 "mask": "0x200", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "thread": { 00:10:37.321 "mask": "0x400", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "nvme_pcie": { 00:10:37.321 "mask": "0x800", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "iaa": { 00:10:37.321 "mask": "0x1000", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "nvme_tcp": { 00:10:37.321 "mask": "0x2000", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "bdev_nvme": { 00:10:37.321 "mask": "0x4000", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 }, 00:10:37.321 "sock": { 00:10:37.321 "mask": "0x8000", 00:10:37.321 "tpoint_mask": "0x0" 00:10:37.321 } 00:10:37.321 }' 00:10:37.321 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:10:37.321 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:10:37.321 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:10:37.321 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:10:37.321 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:10:37.580 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:10:37.580 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:10:37.580 
17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:10:37.580 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:10:37.580 17:04:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:10:37.580 00:10:37.580 real 0m0.252s 00:10:37.580 user 0m0.210s 00:10:37.580 sys 0m0.036s 00:10:37.580 17:04:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.580 17:04:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.580 ************************************ 00:10:37.580 END TEST rpc_trace_cmd_test 00:10:37.580 ************************************ 00:10:37.580 17:04:32 rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:37.580 17:04:32 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:10:37.580 17:04:32 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:10:37.580 17:04:32 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:10:37.580 17:04:32 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:37.580 17:04:32 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:37.580 17:04:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:10:37.580 ************************************ 00:10:37.580 START TEST rpc_daemon_integrity 00:10:37.580 ************************************ 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.580 17:04:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:10:37.840 { 00:10:37.840 "name": "Malloc2", 00:10:37.840 "aliases": [ 00:10:37.840 "e2f17337-1f22-4334-a9b9-53de9a7654cc" 00:10:37.840 ], 00:10:37.840 "product_name": "Malloc disk", 00:10:37.840 "block_size": 512, 00:10:37.840 "num_blocks": 16384, 00:10:37.840 "uuid": "e2f17337-1f22-4334-a9b9-53de9a7654cc", 00:10:37.840 "assigned_rate_limits": { 00:10:37.840 "rw_ios_per_sec": 0, 00:10:37.840 "rw_mbytes_per_sec": 0, 00:10:37.840 "r_mbytes_per_sec": 0, 00:10:37.840 "w_mbytes_per_sec": 0 00:10:37.840 }, 00:10:37.840 "claimed": false, 00:10:37.840 "zoned": false, 00:10:37.840 "supported_io_types": { 00:10:37.840 "read": true, 00:10:37.840 "write": true, 00:10:37.840 "unmap": true, 00:10:37.840 "flush": true, 00:10:37.840 "reset": true, 00:10:37.840 "nvme_admin": false, 00:10:37.840 "nvme_io": false, 00:10:37.840 "nvme_io_md": false, 00:10:37.840 "write_zeroes": true, 00:10:37.840 "zcopy": true, 00:10:37.840 "get_zone_info": false, 00:10:37.840 "zone_management": 
false, 00:10:37.840 "zone_append": false, 00:10:37.840 "compare": false, 00:10:37.840 "compare_and_write": false, 00:10:37.840 "abort": true, 00:10:37.840 "seek_hole": false, 00:10:37.840 "seek_data": false, 00:10:37.840 "copy": true, 00:10:37.840 "nvme_iov_md": false 00:10:37.840 }, 00:10:37.840 "memory_domains": [ 00:10:37.840 { 00:10:37.840 "dma_device_id": "system", 00:10:37.840 "dma_device_type": 1 00:10:37.840 }, 00:10:37.840 { 00:10:37.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.840 "dma_device_type": 2 00:10:37.840 } 00:10:37.840 ], 00:10:37.840 "driver_specific": {} 00:10:37.840 } 00:10:37.840 ]' 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.840 [2024-07-23 17:04:33.080041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:10:37.840 [2024-07-23 17:04:33.080079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:37.840 [2024-07-23 17:04:33.080107] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fbef10 00:10:37.840 [2024-07-23 17:04:33.080120] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:37.840 [2024-07-23 17:04:33.081479] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:37.840 [2024-07-23 17:04:33.081504] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:10:37.840 Passthru0 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.840 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:10:37.840 { 00:10:37.840 "name": "Malloc2", 00:10:37.840 "aliases": [ 00:10:37.840 "e2f17337-1f22-4334-a9b9-53de9a7654cc" 00:10:37.840 ], 00:10:37.840 "product_name": "Malloc disk", 00:10:37.840 "block_size": 512, 00:10:37.840 "num_blocks": 16384, 00:10:37.840 "uuid": "e2f17337-1f22-4334-a9b9-53de9a7654cc", 00:10:37.840 "assigned_rate_limits": { 00:10:37.840 "rw_ios_per_sec": 0, 00:10:37.840 "rw_mbytes_per_sec": 0, 00:10:37.840 "r_mbytes_per_sec": 0, 00:10:37.840 "w_mbytes_per_sec": 0 00:10:37.840 }, 00:10:37.840 "claimed": true, 00:10:37.840 "claim_type": "exclusive_write", 00:10:37.840 "zoned": false, 00:10:37.840 "supported_io_types": { 00:10:37.840 "read": true, 00:10:37.840 "write": true, 00:10:37.840 "unmap": true, 00:10:37.840 "flush": true, 00:10:37.840 "reset": true, 00:10:37.840 "nvme_admin": false, 00:10:37.840 "nvme_io": false, 00:10:37.840 "nvme_io_md": false, 00:10:37.840 "write_zeroes": true, 00:10:37.840 "zcopy": true, 00:10:37.840 "get_zone_info": false, 00:10:37.840 "zone_management": false, 00:10:37.840 "zone_append": false, 00:10:37.840 "compare": false, 00:10:37.840 "compare_and_write": false, 00:10:37.840 "abort": true, 00:10:37.840 "seek_hole": false, 00:10:37.840 "seek_data": false, 00:10:37.840 "copy": true, 00:10:37.840 "nvme_iov_md": false 00:10:37.840 }, 00:10:37.840 "memory_domains": [ 00:10:37.840 { 00:10:37.840 "dma_device_id": "system", 00:10:37.840 "dma_device_type": 1 00:10:37.840 }, 00:10:37.840 { 00:10:37.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.840 "dma_device_type": 2 00:10:37.840 } 00:10:37.840 ], 
00:10:37.840 "driver_specific": {} 00:10:37.840 }, 00:10:37.840 { 00:10:37.840 "name": "Passthru0", 00:10:37.840 "aliases": [ 00:10:37.840 "8c1fbb20-d48f-56a7-bf03-47c47372a916" 00:10:37.840 ], 00:10:37.840 "product_name": "passthru", 00:10:37.840 "block_size": 512, 00:10:37.840 "num_blocks": 16384, 00:10:37.840 "uuid": "8c1fbb20-d48f-56a7-bf03-47c47372a916", 00:10:37.840 "assigned_rate_limits": { 00:10:37.840 "rw_ios_per_sec": 0, 00:10:37.840 "rw_mbytes_per_sec": 0, 00:10:37.840 "r_mbytes_per_sec": 0, 00:10:37.840 "w_mbytes_per_sec": 0 00:10:37.840 }, 00:10:37.840 "claimed": false, 00:10:37.840 "zoned": false, 00:10:37.840 "supported_io_types": { 00:10:37.840 "read": true, 00:10:37.840 "write": true, 00:10:37.840 "unmap": true, 00:10:37.840 "flush": true, 00:10:37.840 "reset": true, 00:10:37.840 "nvme_admin": false, 00:10:37.840 "nvme_io": false, 00:10:37.840 "nvme_io_md": false, 00:10:37.840 "write_zeroes": true, 00:10:37.840 "zcopy": true, 00:10:37.840 "get_zone_info": false, 00:10:37.840 "zone_management": false, 00:10:37.840 "zone_append": false, 00:10:37.840 "compare": false, 00:10:37.840 "compare_and_write": false, 00:10:37.841 "abort": true, 00:10:37.841 "seek_hole": false, 00:10:37.841 "seek_data": false, 00:10:37.841 "copy": true, 00:10:37.841 "nvme_iov_md": false 00:10:37.841 }, 00:10:37.841 "memory_domains": [ 00:10:37.841 { 00:10:37.841 "dma_device_id": "system", 00:10:37.841 "dma_device_type": 1 00:10:37.841 }, 00:10:37.841 { 00:10:37.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.841 "dma_device_type": 2 00:10:37.841 } 00:10:37.841 ], 00:10:37.841 "driver_specific": { 00:10:37.841 "passthru": { 00:10:37.841 "name": "Passthru0", 00:10:37.841 "base_bdev_name": "Malloc2" 00:10:37.841 } 00:10:37.841 } 00:10:37.841 } 00:10:37.841 ]' 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:10:37.841 00:10:37.841 real 0m0.311s 00:10:37.841 user 0m0.193s 00:10:37.841 sys 0m0.051s 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.841 17:04:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:10:37.841 ************************************ 00:10:37.841 END TEST rpc_daemon_integrity 00:10:37.841 ************************************ 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:38.100 17:04:33 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:10:38.100 17:04:33 rpc -- rpc/rpc.sh@84 -- # 
killprocess 4067328 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@948 -- # '[' -z 4067328 ']' 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@952 -- # kill -0 4067328 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@953 -- # uname 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4067328 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4067328' 00:10:38.100 killing process with pid 4067328 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@967 -- # kill 4067328 00:10:38.100 17:04:33 rpc -- common/autotest_common.sh@972 -- # wait 4067328 00:10:38.359 00:10:38.359 real 0m2.926s 00:10:38.359 user 0m3.713s 00:10:38.359 sys 0m0.995s 00:10:38.359 17:04:33 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:38.359 17:04:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:10:38.359 ************************************ 00:10:38.359 END TEST rpc 00:10:38.359 ************************************ 00:10:38.359 17:04:33 -- common/autotest_common.sh@1142 -- # return 0 00:10:38.359 17:04:33 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:10:38.359 17:04:33 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:38.359 17:04:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.359 17:04:33 -- common/autotest_common.sh@10 -- # set +x 00:10:38.359 ************************************ 00:10:38.359 START TEST skip_rpc 00:10:38.359 ************************************ 00:10:38.359 17:04:33 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:10:38.618 * 
Looking for test storage... 00:10:38.618 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:10:38.618 17:04:33 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:10:38.618 17:04:33 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:10:38.618 17:04:33 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:10:38.618 17:04:33 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:38.618 17:04:33 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.618 17:04:33 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:38.618 ************************************ 00:10:38.618 START TEST skip_rpc 00:10:38.618 ************************************ 00:10:38.618 17:04:33 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:10:38.618 17:04:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=4067859 00:10:38.618 17:04:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:10:38.618 17:04:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:10:38.618 17:04:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:10:38.618 [2024-07-23 17:04:33.988174] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:10:38.619 [2024-07-23 17:04:33.988236] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4067859 ] 00:10:38.877 [2024-07-23 17:04:34.118691] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:38.877 [2024-07-23 17:04:34.169404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 4067859 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 4067859 ']' 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 4067859 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4067859 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4067859' 00:10:44.153 killing process with pid 4067859 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 4067859 00:10:44.153 17:04:38 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 4067859 00:10:44.153 00:10:44.153 real 0m5.426s 00:10:44.153 user 0m5.101s 00:10:44.153 sys 0m0.343s 00:10:44.153 17:04:39 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.153 17:04:39 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:44.153 ************************************ 00:10:44.153 END TEST skip_rpc 00:10:44.153 ************************************ 00:10:44.153 17:04:39 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:44.153 17:04:39 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:10:44.153 17:04:39 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:44.153 17:04:39 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:44.153 17:04:39 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:10:44.153 ************************************ 00:10:44.153 START TEST skip_rpc_with_json 00:10:44.153 ************************************ 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=4068588 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 4068588 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 4068588 ']' 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:44.153 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:44.153 17:04:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:10:44.153 [2024-07-23 17:04:39.509569] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:10:44.153 [2024-07-23 17:04:39.509637] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4068588 ] 00:10:44.486 [2024-07-23 17:04:39.643202] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:44.486 [2024-07-23 17:04:39.697985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:10:45.054 [2024-07-23 17:04:40.439545] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:10:45.054 request: 00:10:45.054 { 00:10:45.054 "trtype": "tcp", 00:10:45.054 "method": "nvmf_get_transports", 00:10:45.054 "req_id": 1 00:10:45.054 } 00:10:45.054 Got JSON-RPC error response 00:10:45.054 response: 00:10:45.054 { 00:10:45.054 "code": -19, 00:10:45.054 "message": "No such device" 00:10:45.054 } 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:10:45.054 [2024-07-23 17:04:40.451698] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:45.054 17:04:40 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.054 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:10:45.314 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.314 17:04:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:10:45.314 { 00:10:45.314 "subsystems": [ 00:10:45.314 { 00:10:45.314 "subsystem": "keyring", 00:10:45.314 "config": [] 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "subsystem": "iobuf", 00:10:45.314 "config": [ 00:10:45.314 { 00:10:45.314 "method": "iobuf_set_options", 00:10:45.314 "params": { 00:10:45.314 "small_pool_count": 8192, 00:10:45.314 "large_pool_count": 1024, 00:10:45.314 "small_bufsize": 8192, 00:10:45.314 "large_bufsize": 135168 00:10:45.314 } 00:10:45.314 } 00:10:45.314 ] 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "subsystem": "sock", 00:10:45.314 "config": [ 00:10:45.314 { 00:10:45.314 "method": "sock_set_default_impl", 00:10:45.314 "params": { 00:10:45.314 "impl_name": "posix" 00:10:45.314 } 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "method": "sock_impl_set_options", 00:10:45.314 "params": { 00:10:45.314 "impl_name": "ssl", 00:10:45.314 "recv_buf_size": 4096, 00:10:45.314 "send_buf_size": 4096, 00:10:45.314 "enable_recv_pipe": true, 00:10:45.314 "enable_quickack": false, 00:10:45.314 "enable_placement_id": 0, 00:10:45.314 "enable_zerocopy_send_server": true, 00:10:45.314 "enable_zerocopy_send_client": false, 00:10:45.314 "zerocopy_threshold": 0, 00:10:45.314 "tls_version": 0, 00:10:45.314 "enable_ktls": false 00:10:45.314 } 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "method": "sock_impl_set_options", 00:10:45.314 "params": { 
00:10:45.314 "impl_name": "posix", 00:10:45.314 "recv_buf_size": 2097152, 00:10:45.314 "send_buf_size": 2097152, 00:10:45.314 "enable_recv_pipe": true, 00:10:45.314 "enable_quickack": false, 00:10:45.314 "enable_placement_id": 0, 00:10:45.314 "enable_zerocopy_send_server": true, 00:10:45.314 "enable_zerocopy_send_client": false, 00:10:45.314 "zerocopy_threshold": 0, 00:10:45.314 "tls_version": 0, 00:10:45.314 "enable_ktls": false 00:10:45.314 } 00:10:45.314 } 00:10:45.314 ] 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "subsystem": "vmd", 00:10:45.314 "config": [] 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "subsystem": "accel", 00:10:45.314 "config": [ 00:10:45.314 { 00:10:45.314 "method": "accel_set_options", 00:10:45.314 "params": { 00:10:45.314 "small_cache_size": 128, 00:10:45.314 "large_cache_size": 16, 00:10:45.314 "task_count": 2048, 00:10:45.314 "sequence_count": 2048, 00:10:45.314 "buf_count": 2048 00:10:45.314 } 00:10:45.314 } 00:10:45.314 ] 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "subsystem": "bdev", 00:10:45.314 "config": [ 00:10:45.314 { 00:10:45.314 "method": "bdev_set_options", 00:10:45.314 "params": { 00:10:45.314 "bdev_io_pool_size": 65535, 00:10:45.314 "bdev_io_cache_size": 256, 00:10:45.314 "bdev_auto_examine": true, 00:10:45.314 "iobuf_small_cache_size": 128, 00:10:45.314 "iobuf_large_cache_size": 16 00:10:45.314 } 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "method": "bdev_raid_set_options", 00:10:45.314 "params": { 00:10:45.314 "process_window_size_kb": 1024, 00:10:45.314 "process_max_bandwidth_mb_sec": 0 00:10:45.314 } 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "method": "bdev_iscsi_set_options", 00:10:45.314 "params": { 00:10:45.314 "timeout_sec": 30 00:10:45.314 } 00:10:45.314 }, 00:10:45.314 { 00:10:45.314 "method": "bdev_nvme_set_options", 00:10:45.314 "params": { 00:10:45.314 "action_on_timeout": "none", 00:10:45.314 "timeout_us": 0, 00:10:45.314 "timeout_admin_us": 0, 00:10:45.314 "keep_alive_timeout_ms": 10000, 00:10:45.314 
"arbitration_burst": 0, 00:10:45.314 "low_priority_weight": 0, 00:10:45.314 "medium_priority_weight": 0, 00:10:45.314 "high_priority_weight": 0, 00:10:45.315 "nvme_adminq_poll_period_us": 10000, 00:10:45.315 "nvme_ioq_poll_period_us": 0, 00:10:45.315 "io_queue_requests": 0, 00:10:45.315 "delay_cmd_submit": true, 00:10:45.315 "transport_retry_count": 4, 00:10:45.315 "bdev_retry_count": 3, 00:10:45.315 "transport_ack_timeout": 0, 00:10:45.315 "ctrlr_loss_timeout_sec": 0, 00:10:45.315 "reconnect_delay_sec": 0, 00:10:45.315 "fast_io_fail_timeout_sec": 0, 00:10:45.315 "disable_auto_failback": false, 00:10:45.315 "generate_uuids": false, 00:10:45.315 "transport_tos": 0, 00:10:45.315 "nvme_error_stat": false, 00:10:45.315 "rdma_srq_size": 0, 00:10:45.315 "io_path_stat": false, 00:10:45.315 "allow_accel_sequence": false, 00:10:45.315 "rdma_max_cq_size": 0, 00:10:45.315 "rdma_cm_event_timeout_ms": 0, 00:10:45.315 "dhchap_digests": [ 00:10:45.315 "sha256", 00:10:45.315 "sha384", 00:10:45.315 "sha512" 00:10:45.315 ], 00:10:45.315 "dhchap_dhgroups": [ 00:10:45.315 "null", 00:10:45.315 "ffdhe2048", 00:10:45.315 "ffdhe3072", 00:10:45.315 "ffdhe4096", 00:10:45.315 "ffdhe6144", 00:10:45.315 "ffdhe8192" 00:10:45.315 ] 00:10:45.315 } 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "method": "bdev_nvme_set_hotplug", 00:10:45.315 "params": { 00:10:45.315 "period_us": 100000, 00:10:45.315 "enable": false 00:10:45.315 } 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "method": "bdev_wait_for_examine" 00:10:45.315 } 00:10:45.315 ] 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "scsi", 00:10:45.315 "config": null 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "scheduler", 00:10:45.315 "config": [ 00:10:45.315 { 00:10:45.315 "method": "framework_set_scheduler", 00:10:45.315 "params": { 00:10:45.315 "name": "static" 00:10:45.315 } 00:10:45.315 } 00:10:45.315 ] 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "vhost_scsi", 00:10:45.315 "config": [] 00:10:45.315 }, 
00:10:45.315 { 00:10:45.315 "subsystem": "vhost_blk", 00:10:45.315 "config": [] 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "ublk", 00:10:45.315 "config": [] 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "nbd", 00:10:45.315 "config": [] 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "nvmf", 00:10:45.315 "config": [ 00:10:45.315 { 00:10:45.315 "method": "nvmf_set_config", 00:10:45.315 "params": { 00:10:45.315 "discovery_filter": "match_any", 00:10:45.315 "admin_cmd_passthru": { 00:10:45.315 "identify_ctrlr": false 00:10:45.315 } 00:10:45.315 } 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "method": "nvmf_set_max_subsystems", 00:10:45.315 "params": { 00:10:45.315 "max_subsystems": 1024 00:10:45.315 } 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "method": "nvmf_set_crdt", 00:10:45.315 "params": { 00:10:45.315 "crdt1": 0, 00:10:45.315 "crdt2": 0, 00:10:45.315 "crdt3": 0 00:10:45.315 } 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "method": "nvmf_create_transport", 00:10:45.315 "params": { 00:10:45.315 "trtype": "TCP", 00:10:45.315 "max_queue_depth": 128, 00:10:45.315 "max_io_qpairs_per_ctrlr": 127, 00:10:45.315 "in_capsule_data_size": 4096, 00:10:45.315 "max_io_size": 131072, 00:10:45.315 "io_unit_size": 131072, 00:10:45.315 "max_aq_depth": 128, 00:10:45.315 "num_shared_buffers": 511, 00:10:45.315 "buf_cache_size": 4294967295, 00:10:45.315 "dif_insert_or_strip": false, 00:10:45.315 "zcopy": false, 00:10:45.315 "c2h_success": true, 00:10:45.315 "sock_priority": 0, 00:10:45.315 "abort_timeout_sec": 1, 00:10:45.315 "ack_timeout": 0, 00:10:45.315 "data_wr_pool_size": 0 00:10:45.315 } 00:10:45.315 } 00:10:45.315 ] 00:10:45.315 }, 00:10:45.315 { 00:10:45.315 "subsystem": "iscsi", 00:10:45.315 "config": [ 00:10:45.315 { 00:10:45.315 "method": "iscsi_set_options", 00:10:45.315 "params": { 00:10:45.315 "node_base": "iqn.2016-06.io.spdk", 00:10:45.315 "max_sessions": 128, 00:10:45.315 "max_connections_per_session": 2, 00:10:45.315 "max_queue_depth": 
64, 00:10:45.315 "default_time2wait": 2, 00:10:45.315 "default_time2retain": 20, 00:10:45.315 "first_burst_length": 8192, 00:10:45.315 "immediate_data": true, 00:10:45.315 "allow_duplicated_isid": false, 00:10:45.315 "error_recovery_level": 0, 00:10:45.315 "nop_timeout": 60, 00:10:45.315 "nop_in_interval": 30, 00:10:45.315 "disable_chap": false, 00:10:45.315 "require_chap": false, 00:10:45.315 "mutual_chap": false, 00:10:45.315 "chap_group": 0, 00:10:45.315 "max_large_datain_per_connection": 64, 00:10:45.315 "max_r2t_per_connection": 4, 00:10:45.315 "pdu_pool_size": 36864, 00:10:45.315 "immediate_data_pool_size": 16384, 00:10:45.315 "data_out_pool_size": 2048 00:10:45.315 } 00:10:45.315 } 00:10:45.315 ] 00:10:45.315 } 00:10:45.315 ] 00:10:45.315 } 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 4068588 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 4068588 ']' 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 4068588 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4068588 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4068588' 00:10:45.315 killing process with pid 4068588 00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 4068588 
00:10:45.315 17:04:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 4068588 00:10:45.884 17:04:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=4068782 00:10:45.884 17:04:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:10:45.884 17:04:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 4068782 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 4068782 ']' 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 4068782 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4068782 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4068782' 00:10:51.153 killing process with pid 4068782 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 4068782 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 4068782 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- 
rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:10:51.153 00:10:51.153 real 0m7.014s 00:10:51.153 user 0m6.642s 00:10:51.153 sys 0m0.934s 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:10:51.153 ************************************ 00:10:51.153 END TEST skip_rpc_with_json 00:10:51.153 ************************************ 00:10:51.153 17:04:46 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:51.153 17:04:46 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:10:51.153 17:04:46 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:51.153 17:04:46 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:51.153 17:04:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:51.153 ************************************ 00:10:51.153 START TEST skip_rpc_with_delay 00:10:51.153 ************************************ 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:10:51.153 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:10:51.412 [2024-07-23 17:04:46.615925] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:10:51.412 [2024-07-23 17:04:46.616030] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:10:51.412 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:10:51.412 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:51.412 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:51.412 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:51.412 00:10:51.412 real 0m0.099s 00:10:51.412 user 0m0.056s 00:10:51.412 sys 0m0.042s 00:10:51.412 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:51.412 17:04:46 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:10:51.412 ************************************ 00:10:51.412 END TEST skip_rpc_with_delay 00:10:51.412 ************************************ 00:10:51.412 17:04:46 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:51.412 17:04:46 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:10:51.412 17:04:46 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:10:51.412 17:04:46 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:10:51.412 17:04:46 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:51.412 17:04:46 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:51.412 17:04:46 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:51.412 ************************************ 00:10:51.412 START TEST exit_on_failed_rpc_init 00:10:51.412 ************************************ 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=4069606 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 4069606 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 4069606 ']' 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:51.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:51.412 17:04:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:10:51.671 [2024-07-23 17:04:46.851175] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:10:51.671 [2024-07-23 17:04:46.851313] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4069606 ] 00:10:51.671 [2024-07-23 17:04:47.058650] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:51.929 [2024-07-23 17:04:47.117387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:52.187 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:52.188 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:10:52.188 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:10:52.188 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:10:52.188 [2024-07-23 17:04:47.498770] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:10:52.188 [2024-07-23 17:04:47.498912] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4069710 ] 00:10:52.445 [2024-07-23 17:04:47.707667] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:52.445 [2024-07-23 17:04:47.765925] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:52.445 [2024-07-23 17:04:47.766022] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:10:52.445 [2024-07-23 17:04:47.766043] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:10:52.446 [2024-07-23 17:04:47.766059] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 4069606 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 4069606 ']' 00:10:52.446 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 4069606 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4069606 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4069606' 
00:10:52.704 killing process with pid 4069606 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 4069606 00:10:52.704 17:04:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 4069606 00:10:52.962 00:10:52.962 real 0m1.531s 00:10:52.963 user 0m1.805s 00:10:52.963 sys 0m0.754s 00:10:52.963 17:04:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:52.963 17:04:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:10:52.963 ************************************ 00:10:52.963 END TEST exit_on_failed_rpc_init 00:10:52.963 ************************************ 00:10:52.963 17:04:48 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:52.963 17:04:48 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:10:52.963 00:10:52.963 real 0m14.523s 00:10:52.963 user 0m13.779s 00:10:52.963 sys 0m2.389s 00:10:52.963 17:04:48 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:52.963 17:04:48 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:52.963 ************************************ 00:10:52.963 END TEST skip_rpc 00:10:52.963 ************************************ 00:10:52.963 17:04:48 -- common/autotest_common.sh@1142 -- # return 0 00:10:52.963 17:04:48 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:10:52.963 17:04:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:52.963 17:04:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:52.963 17:04:48 -- common/autotest_common.sh@10 -- # set +x 00:10:53.222 ************************************ 00:10:53.222 START TEST rpc_client 00:10:53.222 ************************************ 00:10:53.222 17:04:48 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:10:53.222 * Looking for test storage... 00:10:53.222 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:10:53.222 17:04:48 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:10:53.222 OK 00:10:53.222 17:04:48 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:10:53.222 00:10:53.222 real 0m0.143s 00:10:53.222 user 0m0.065s 00:10:53.222 sys 0m0.088s 00:10:53.222 17:04:48 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:53.222 17:04:48 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:10:53.222 ************************************ 00:10:53.222 END TEST rpc_client 00:10:53.222 ************************************ 00:10:53.222 17:04:48 -- common/autotest_common.sh@1142 -- # return 0 00:10:53.222 17:04:48 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:10:53.222 17:04:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:53.222 17:04:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:53.222 17:04:48 -- common/autotest_common.sh@10 -- # set +x 00:10:53.222 ************************************ 00:10:53.222 START TEST json_config 00:10:53.222 ************************************ 00:10:53.222 17:04:48 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:10:53.481 17:04:48 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@7 -- # uname -s 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:10:53.481 17:04:48 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:10:53.481 17:04:48 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:10:53.482 17:04:48 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:10:53.482 17:04:48 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:53.482 17:04:48 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:53.482 17:04:48 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:53.482 17:04:48 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:53.482 17:04:48 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:53.482 17:04:48 json_config -- paths/export.sh@5 -- # export PATH 00:10:53.482 17:04:48 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@47 -- # : 0 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:10:53.482 
17:04:48 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:10:53.482 17:04:48 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:10:53.482 INFO: JSON configuration test init 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:53.482 17:04:48 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:10:53.482 17:04:48 json_config -- json_config/common.sh@9 -- # local app=target 00:10:53.482 17:04:48 json_config -- json_config/common.sh@10 -- # shift 00:10:53.482 17:04:48 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:10:53.482 17:04:48 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:10:53.482 17:04:48 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:10:53.482 17:04:48 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:10:53.482 17:04:48 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:10:53.482 17:04:48 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=4069992 00:10:53.482 17:04:48 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:10:53.482 Waiting for target to run... 00:10:53.482 17:04:48 json_config -- json_config/common.sh@25 -- # waitforlisten 4069992 /var/tmp/spdk_tgt.sock 00:10:53.482 17:04:48 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@829 -- # '[' -z 4069992 ']' 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:10:53.482 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:53.482 17:04:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:53.482 [2024-07-23 17:04:48.870282] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:10:53.482 [2024-07-23 17:04:48.870418] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4069992 ] 00:10:54.051 [2024-07-23 17:04:49.454233] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:54.309 [2024-07-23 17:04:49.490944] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:54.876 17:04:49 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:54.876 17:04:49 json_config -- common/autotest_common.sh@862 -- # return 0 00:10:54.876 17:04:49 json_config -- json_config/common.sh@26 -- # echo '' 00:10:54.876 00:10:54.876 17:04:50 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:10:54.876 17:04:50 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:10:54.876 17:04:50 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:54.876 17:04:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:54.876 17:04:50 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:10:54.876 17:04:50 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:10:54.876 17:04:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:10:54.876 17:04:50 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:10:54.876 17:04:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:10:55.134 [2024-07-23 17:04:50.485929] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:10:55.134 17:04:50 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:10:55.134 17:04:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:10:55.393 [2024-07-23 17:04:50.734564] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:10:55.393 17:04:50 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:10:55.393 17:04:50 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:10:55.393 17:04:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:55.393 17:04:50 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:10:55.393 17:04:50 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:10:55.393 17:04:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:10:55.653 [2024-07-23 17:04:51.051992] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:10:58.943 17:04:53 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:58.943 17:04:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@46 -- # local 
enabled_types 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:10:58.943 17:04:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:10:58.943 17:04:53 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@48 -- # local get_types 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@51 -- # sort 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:10:58.943 17:04:54 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:10:58.943 17:04:54 json_config -- common/autotest_common.sh@10 -- # set +x 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@59 -- # return 0 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:10:58.943 17:04:54 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:10:58.943 17:04:54 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:10:58.943 17:04:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:10:58.943 17:04:54 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:10:58.943 17:04:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:10:59.202 Nvme0n1p0 Nvme0n1p1 00:10:59.202 17:04:54 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:10:59.202 17:04:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:10:59.461 [2024-07-23 17:04:54.833729] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:10:59.461 [2024-07-23 17:04:54.833783] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:10:59.461 00:10:59.461 17:04:54 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:10:59.461 17:04:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:10:59.720 Malloc3 00:10:59.720 17:04:55 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:10:59.720 17:04:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:10:59.980 [2024-07-23 17:04:55.327160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:59.980 [2024-07-23 17:04:55.327205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.980 [2024-07-23 17:04:55.327230] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2023840 00:10:59.980 [2024-07-23 17:04:55.327243] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.980 [2024-07-23 17:04:55.328813] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.980 [2024-07-23 17:04:55.328842] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:10:59.980 PTBdevFromMalloc3 00:10:59.980 17:04:55 
json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:10:59.980 17:04:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:11:00.239 Null0 00:11:00.239 17:04:55 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:11:00.239 17:04:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:11:00.501 Malloc0 00:11:00.501 17:04:55 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:11:00.501 17:04:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:11:00.760 Malloc1 00:11:00.760 17:04:56 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:11:00.760 17:04:56 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:11:01.019 102400+0 records in 00:11:01.019 102400+0 records out 00:11:01.019 104857600 bytes (105 MB, 100 MiB) copied, 0.295941 s, 354 MB/s 00:11:01.019 17:04:56 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:11:01.019 17:04:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:11:01.279 aio_disk 00:11:01.279 17:04:56 json_config -- 
json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:11:01.279 17:04:56 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:11:01.279 17:04:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:11:06.553 5d545b77-0898-4981-a8b6-65348064f0a9 00:11:06.553 17:05:01 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:11:06.553 17:05:01 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:11:06.553 17:05:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:11:06.553 17:05:01 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:11:06.553 17:05:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:11:06.553 17:05:01 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:11:06.553 17:05:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:11:06.812 17:05:01 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:11:06.812 17:05:01 json_config -- json_config/common.sh@57 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:11:06.812 17:05:02 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:11:06.812 17:05:02 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:11:06.812 17:05:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:11:07.071 MallocForCryptoBdev 00:11:07.071 17:05:02 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:11:07.071 17:05:02 json_config -- json_config/json_config.sh@163 -- # wc -l 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]] 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:11:07.330 17:05:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:11:07.330 [2024-07-23 17:05:02.726316] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:11:07.330 CryptoMallocBdev 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 
bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:57142b98-e958-4def-ab11-6a10a3a86105 bdev_register:ab8dd994-6ec2-4a90-919c-b87019068795 bdev_register:c9ced1b1-7be6-49cc-92f1-99bdd9750d18 bdev_register:ce791df9-004e-488d-9103-f3f93f23ee28 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:11:07.330 17:05:02 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@75 -- # sort 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:57142b98-e958-4def-ab11-6a10a3a86105 bdev_register:ab8dd994-6ec2-4a90-919c-b87019068795 bdev_register:c9ced1b1-7be6-49cc-92f1-99bdd9750d18 bdev_register:ce791df9-004e-488d-9103-f3f93f23ee28 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@76 -- # sort 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:11:07.589 17:05:02 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.589 17:05:02 
json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.590 17:05:02 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:11:07.590 17:05:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:11:07.590 17:05:02 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:Null0 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:57142b98-e958-4def-ab11-6a10a3a86105 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:ab8dd994-6ec2-4a90-919c-b87019068795 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:c9ced1b1-7be6-49cc-92f1-99bdd9750d18 00:11:07.849 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:ce791df9-004e-488d-9103-f3f93f23ee28 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:57142b98-e958-4def-ab11-6a10a3a86105 bdev_register:ab8dd994-6ec2-4a90-919c-b87019068795 bdev_register:aio_disk bdev_register:c9ced1b1-7be6-49cc-92f1-99bdd9750d18 
bdev_register:ce791df9-004e-488d-9103-f3f93f23ee28 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\5\7\1\4\2\b\9\8\-\e\9\5\8\-\4\d\e\f\-\a\b\1\1\-\6\a\1\0\a\3\a\8\6\1\0\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\b\8\d\d\9\9\4\-\6\e\c\2\-\4\a\9\0\-\9\1\9\c\-\b\8\7\0\1\9\0\6\8\7\9\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\c\9\c\e\d\1\b\1\-\7\b\e\6\-\4\9\c\c\-\9\2\f\1\-\9\9\b\d\d\9\7\5\0\d\1\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\c\e\7\9\1\d\f\9\-\0\0\4\e\-\4\8\8\d\-\9\1\0\3\-\f\3\f\9\3\f\2\3\e\e\2\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@90 -- # cat 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:57142b98-e958-4def-ab11-6a10a3a86105 bdev_register:ab8dd994-6ec2-4a90-919c-b87019068795 bdev_register:aio_disk bdev_register:c9ced1b1-7be6-49cc-92f1-99bdd9750d18 bdev_register:ce791df9-004e-488d-9103-f3f93f23ee28 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 
bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:11:07.850 Expected events matched: 00:11:07.850 bdev_register:57142b98-e958-4def-ab11-6a10a3a86105 00:11:07.850 bdev_register:ab8dd994-6ec2-4a90-919c-b87019068795 00:11:07.850 bdev_register:aio_disk 00:11:07.850 bdev_register:c9ced1b1-7be6-49cc-92f1-99bdd9750d18 00:11:07.850 bdev_register:ce791df9-004e-488d-9103-f3f93f23ee28 00:11:07.850 bdev_register:CryptoMallocBdev 00:11:07.850 bdev_register:Malloc0 00:11:07.850 bdev_register:Malloc0p0 00:11:07.850 bdev_register:Malloc0p1 00:11:07.850 bdev_register:Malloc0p2 00:11:07.850 bdev_register:Malloc1 00:11:07.850 bdev_register:Malloc3 00:11:07.850 bdev_register:MallocForCryptoBdev 00:11:07.850 bdev_register:Null0 00:11:07.850 bdev_register:Nvme0n1 00:11:07.850 bdev_register:Nvme0n1p0 00:11:07.850 bdev_register:Nvme0n1p1 00:11:07.850 bdev_register:PTBdevFromMalloc3 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:11:07.850 17:05:03 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:11:07.850 17:05:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:11:07.850 17:05:03 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:11:07.850 17:05:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:11:07.850 17:05:03 json_config -- json_config/json_config.sh@304 -- # 
tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:11:07.850 17:05:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:11:08.128 MallocBdevForConfigChangeCheck 00:11:08.128 17:05:03 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:11:08.128 17:05:03 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:11:08.128 17:05:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:08.128 17:05:03 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:11:08.128 17:05:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:11:08.414 17:05:03 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:11:08.414 INFO: shutting down applications... 
00:11:08.414 17:05:03 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:11:08.414 17:05:03 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:11:08.414 17:05:03 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:11:08.414 17:05:03 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:11:08.672 [2024-07-23 17:05:03.893998] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:11:11.961 Calling clear_iscsi_subsystem 00:11:11.961 Calling clear_nvmf_subsystem 00:11:11.961 Calling clear_nbd_subsystem 00:11:11.961 Calling clear_ublk_subsystem 00:11:11.961 Calling clear_vhost_blk_subsystem 00:11:11.961 Calling clear_vhost_scsi_subsystem 00:11:11.961 Calling clear_bdev_subsystem 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@347 -- # count=100 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@349 -- # break 00:11:11.961 17:05:06 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:11:11.961 17:05:06 
json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:11:11.961 17:05:06 json_config -- json_config/common.sh@31 -- # local app=target 00:11:11.961 17:05:06 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:11:11.961 17:05:06 json_config -- json_config/common.sh@35 -- # [[ -n 4069992 ]] 00:11:11.961 17:05:06 json_config -- json_config/common.sh@38 -- # kill -SIGINT 4069992 00:11:11.961 17:05:06 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:11:11.961 17:05:06 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:11:11.961 17:05:06 json_config -- json_config/common.sh@41 -- # kill -0 4069992 00:11:11.961 17:05:06 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:11:12.221 17:05:07 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:11:12.221 17:05:07 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:11:12.221 17:05:07 json_config -- json_config/common.sh@41 -- # kill -0 4069992 00:11:12.221 17:05:07 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:11:12.221 17:05:07 json_config -- json_config/common.sh@43 -- # break 00:11:12.221 17:05:07 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:11:12.221 17:05:07 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:11:12.221 SPDK target shutdown done 00:11:12.221 17:05:07 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:11:12.221 INFO: relaunching applications... 
00:11:12.221 17:05:07 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:12.221 17:05:07 json_config -- json_config/common.sh@9 -- # local app=target 00:11:12.221 17:05:07 json_config -- json_config/common.sh@10 -- # shift 00:11:12.221 17:05:07 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:11:12.221 17:05:07 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:11:12.221 17:05:07 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:11:12.221 17:05:07 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:11:12.221 17:05:07 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:11:12.221 17:05:07 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=4072607 00:11:12.221 17:05:07 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:11:12.221 Waiting for target to run... 00:11:12.221 17:05:07 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:12.221 17:05:07 json_config -- json_config/common.sh@25 -- # waitforlisten 4072607 /var/tmp/spdk_tgt.sock 00:11:12.221 17:05:07 json_config -- common/autotest_common.sh@829 -- # '[' -z 4072607 ']' 00:11:12.221 17:05:07 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:11:12.221 17:05:07 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:12.221 17:05:07 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:11:12.221 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
00:11:12.221 17:05:07 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:12.221 17:05:07 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:12.221 [2024-07-23 17:05:07.566068] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:11:12.221 [2024-07-23 17:05:07.566147] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4072607 ] 00:11:13.159 [2024-07-23 17:05:08.222333] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:13.159 [2024-07-23 17:05:08.263170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:13.159 [2024-07-23 17:05:08.317421] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:11:13.159 [2024-07-23 17:05:08.325456] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:11:13.159 [2024-07-23 17:05:08.333474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:11:13.159 [2024-07-23 17:05:08.414769] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:11:15.694 [2024-07-23 17:05:10.771715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:15.695 [2024-07-23 17:05:10.771781] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:11:15.695 [2024-07-23 17:05:10.771796] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:15.695 [2024-07-23 17:05:10.779728] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:11:15.695 [2024-07-23 17:05:10.779754] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Nvme0n1 00:11:15.695 [2024-07-23 17:05:10.787744] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:11:15.695 [2024-07-23 17:05:10.787769] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:11:15.695 [2024-07-23 17:05:10.795779] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:11:15.695 [2024-07-23 17:05:10.795806] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:11:15.695 [2024-07-23 17:05:10.795819] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:11:15.954 [2024-07-23 17:05:11.168815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:11:15.954 [2024-07-23 17:05:11.168864] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:15.954 [2024-07-23 17:05:11.168883] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x3096890 00:11:15.954 [2024-07-23 17:05:11.168908] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:15.954 [2024-07-23 17:05:11.169200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:15.954 [2024-07-23 17:05:11.169220] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:11:16.891 17:05:12 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:16.891 17:05:12 json_config -- common/autotest_common.sh@862 -- # return 0 00:11:16.891 17:05:12 json_config -- json_config/common.sh@26 -- # echo '' 00:11:16.891 00:11:16.891 17:05:12 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:11:16.891 17:05:12 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:11:16.891 INFO: Checking if target configuration is the same... 
00:11:16.891 17:05:12 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:16.891 17:05:12 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:11:16.891 17:05:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:11:16.891 + '[' 2 -ne 2 ']' 00:11:16.891 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:11:16.891 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:11:16.891 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:11:16.891 +++ basename /dev/fd/62 00:11:16.891 ++ mktemp /tmp/62.XXX 00:11:16.891 + tmp_file_1=/tmp/62.OA6 00:11:16.891 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:16.891 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:11:16.891 + tmp_file_2=/tmp/spdk_tgt_config.json.Nqm 00:11:16.891 + ret=0 00:11:16.891 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:11:17.459 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:11:17.459 + diff -u /tmp/62.OA6 /tmp/spdk_tgt_config.json.Nqm 00:11:17.459 + echo 'INFO: JSON config files are the same' 00:11:17.459 INFO: JSON config files are the same 00:11:17.459 + rm /tmp/62.OA6 /tmp/spdk_tgt_config.json.Nqm 00:11:17.459 + exit 0 00:11:17.459 17:05:12 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:11:17.459 17:05:12 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:11:17.459 INFO: changing configuration and checking if this can be detected... 
00:11:17.459 17:05:12 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:11:17.459 17:05:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:11:17.718 17:05:12 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:11:17.718 17:05:12 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:17.718 17:05:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:11:17.718 + '[' 2 -ne 2 ']' 00:11:17.718 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:11:17.718 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:11:17.718 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:11:17.718 +++ basename /dev/fd/62 00:11:17.718 ++ mktemp /tmp/62.XXX 00:11:17.718 + tmp_file_1=/tmp/62.7q0 00:11:17.718 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:17.718 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:11:17.718 + tmp_file_2=/tmp/spdk_tgt_config.json.PbX 00:11:17.718 + ret=0 00:11:17.718 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:11:18.285 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:11:18.285 + diff -u /tmp/62.7q0 /tmp/spdk_tgt_config.json.PbX 00:11:18.285 + ret=1 00:11:18.285 + echo '=== Start of file: /tmp/62.7q0 ===' 00:11:18.285 + cat /tmp/62.7q0 00:11:18.285 + echo '=== End of file: /tmp/62.7q0 ===' 00:11:18.285 + echo '' 00:11:18.285 + echo '=== Start of file: /tmp/spdk_tgt_config.json.PbX ===' 00:11:18.285 + cat /tmp/spdk_tgt_config.json.PbX 00:11:18.285 + echo '=== End of file: /tmp/spdk_tgt_config.json.PbX ===' 00:11:18.285 + echo '' 00:11:18.285 + rm /tmp/62.7q0 /tmp/spdk_tgt_config.json.PbX 00:11:18.285 + exit 1 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:11:18.285 INFO: configuration change detected. 
00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:11:18.285 17:05:13 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:11:18.285 17:05:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@321 -- # [[ -n 4072607 ]] 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:11:18.285 17:05:13 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:11:18.544 17:05:13 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:11:18.544 17:05:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:18.544 17:05:13 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:11:18.544 17:05:13 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:11:18.544 17:05:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:11:18.544 17:05:13 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:11:18.544 17:05:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:11:18.802 17:05:14 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:11:18.802 17:05:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:11:19.060 17:05:14 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:11:19.060 17:05:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:11:19.319 17:05:14 json_config -- json_config/json_config.sh@197 -- # uname -s 00:11:19.319 17:05:14 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:11:19.319 17:05:14 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:11:19.319 17:05:14 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:11:19.319 17:05:14 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:11:19.319 17:05:14 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:11:19.319 17:05:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:19.577 17:05:14 json_config -- json_config/json_config.sh@327 -- # killprocess 4072607 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@948 -- # '[' -z 4072607 ']' 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@952 -- # kill -0 4072607 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@953 -- # uname 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4072607 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4072607' 00:11:19.577 killing process with pid 4072607 00:11:19.577 17:05:14 json_config -- common/autotest_common.sh@967 -- # kill 4072607 00:11:19.577 17:05:14 json_config -- 
common/autotest_common.sh@972 -- # wait 4072607 00:11:22.862 17:05:17 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:11:22.862 17:05:18 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:11:22.862 17:05:18 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:11:22.862 17:05:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:22.862 17:05:18 json_config -- json_config/json_config.sh@332 -- # return 0 00:11:22.862 17:05:18 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:11:22.862 INFO: Success 00:11:22.862 00:11:22.862 real 0m29.435s 00:11:22.862 user 0m35.568s 00:11:22.862 sys 0m4.676s 00:11:22.862 17:05:18 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:22.862 17:05:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:11:22.862 ************************************ 00:11:22.862 END TEST json_config 00:11:22.862 ************************************ 00:11:22.862 17:05:18 -- common/autotest_common.sh@1142 -- # return 0 00:11:22.862 17:05:18 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:11:22.862 17:05:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:22.862 17:05:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:22.862 17:05:18 -- common/autotest_common.sh@10 -- # set +x 00:11:22.862 ************************************ 00:11:22.862 START TEST json_config_extra_key 00:11:22.862 ************************************ 00:11:22.862 17:05:18 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:11:22.862 17:05:18 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:11:22.862 17:05:18 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:11:22.862 17:05:18 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:22.862 17:05:18 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:22.863 17:05:18 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:22.863 17:05:18 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:22.863 17:05:18 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:22.863 17:05:18 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:22.863 17:05:18 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:11:22.863 17:05:18 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:11:22.863 17:05:18 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:11:22.863 INFO: launching applications... 00:11:22.863 17:05:18 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=4074125 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:11:22.863 Waiting for target to run... 
00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 4074125 /var/tmp/spdk_tgt.sock 00:11:22.863 17:05:18 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 4074125 ']' 00:11:22.863 17:05:18 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:11:22.863 17:05:18 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:11:22.863 17:05:18 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:22.863 17:05:18 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:11:22.863 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:11:22.863 17:05:18 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:22.863 17:05:18 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:11:23.122 [2024-07-23 17:05:18.324925] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:11:23.122 [2024-07-23 17:05:18.325001] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4074125 ] 00:11:23.689 [2024-07-23 17:05:18.916265] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:23.689 [2024-07-23 17:05:18.956344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:23.948 17:05:19 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:23.948 17:05:19 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:11:23.948 17:05:19 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:11:23.948 00:11:23.948 17:05:19 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:11:23.948 INFO: shutting down applications... 00:11:23.948 17:05:19 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:11:23.948 17:05:19 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:11:23.948 17:05:19 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:11:23.948 17:05:19 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 4074125 ]] 00:11:23.948 17:05:19 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 4074125 00:11:23.948 17:05:19 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:11:23.949 17:05:19 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:11:23.949 17:05:19 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 4074125 00:11:23.949 17:05:19 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:11:24.518 17:05:19 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:11:24.518 17:05:19 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:11:24.518 17:05:19 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 4074125 00:11:24.518 17:05:19 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:11:24.518 17:05:19 json_config_extra_key -- json_config/common.sh@43 -- # break 00:11:24.518 17:05:19 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:11:24.518 17:05:19 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:11:24.518 SPDK target shutdown done 00:11:24.518 17:05:19 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:11:24.518 Success 00:11:24.518 00:11:24.518 real 0m1.621s 00:11:24.518 user 0m1.055s 00:11:24.518 sys 0m0.752s 00:11:24.518 17:05:19 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:24.518 17:05:19 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:11:24.518 ************************************ 00:11:24.518 END TEST json_config_extra_key 00:11:24.518 ************************************ 00:11:24.518 17:05:19 -- common/autotest_common.sh@1142 -- # return 0 00:11:24.518 17:05:19 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:11:24.518 17:05:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:24.518 17:05:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:24.518 17:05:19 -- common/autotest_common.sh@10 -- # set +x 00:11:24.518 ************************************ 00:11:24.518 START TEST alias_rpc 00:11:24.518 ************************************ 00:11:24.518 17:05:19 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:11:24.518 * Looking for test storage... 
00:11:24.777 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:11:24.777 17:05:19 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:11:24.777 17:05:19 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=4074359 00:11:24.777 17:05:19 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:11:24.777 17:05:19 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 4074359 00:11:24.777 17:05:19 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 4074359 ']' 00:11:24.777 17:05:19 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:24.777 17:05:19 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:24.777 17:05:19 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:24.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:24.777 17:05:19 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:24.777 17:05:19 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:24.777 [2024-07-23 17:05:20.016395] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:11:24.777 [2024-07-23 17:05:20.016469] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4074359 ] 00:11:24.777 [2024-07-23 17:05:20.149966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:25.036 [2024-07-23 17:05:20.200053] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.604 17:05:20 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:25.605 17:05:20 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:11:25.605 17:05:20 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:11:25.864 17:05:21 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 4074359 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 4074359 ']' 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 4074359 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4074359 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4074359' 00:11:25.864 killing process with pid 4074359 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@967 -- # kill 4074359 00:11:25.864 17:05:21 alias_rpc -- common/autotest_common.sh@972 -- # wait 4074359 00:11:26.431 00:11:26.431 real 0m1.763s 00:11:26.431 user 0m1.964s 00:11:26.431 sys 0m0.552s 00:11:26.431 17:05:21 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:11:26.431 17:05:21 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:26.432 ************************************ 00:11:26.432 END TEST alias_rpc 00:11:26.432 ************************************ 00:11:26.432 17:05:21 -- common/autotest_common.sh@1142 -- # return 0 00:11:26.432 17:05:21 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:11:26.432 17:05:21 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:11:26.432 17:05:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:26.432 17:05:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:26.432 17:05:21 -- common/autotest_common.sh@10 -- # set +x 00:11:26.432 ************************************ 00:11:26.432 START TEST spdkcli_tcp 00:11:26.432 ************************************ 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:11:26.432 * Looking for test storage... 
00:11:26.432 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=4074667 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 4074667 00:11:26.432 17:05:21 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 4074667 ']' 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:26.432 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:26.432 17:05:21 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:11:26.690 [2024-07-23 17:05:21.880219] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:11:26.690 [2024-07-23 17:05:21.880297] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4074667 ] 00:11:26.690 [2024-07-23 17:05:22.015307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:26.690 [2024-07-23 17:05:22.071771] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:26.690 [2024-07-23 17:05:22.071776] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:27.625 17:05:22 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:27.625 17:05:22 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:11:27.625 17:05:22 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=4074772 00:11:27.625 17:05:22 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:11:27.625 17:05:22 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:11:27.883 [ 00:11:27.883 "bdev_malloc_delete", 00:11:27.883 "bdev_malloc_create", 00:11:27.883 "bdev_null_resize", 00:11:27.883 "bdev_null_delete", 00:11:27.884 "bdev_null_create", 00:11:27.884 "bdev_nvme_cuse_unregister", 00:11:27.884 "bdev_nvme_cuse_register", 00:11:27.884 "bdev_opal_new_user", 00:11:27.884 "bdev_opal_set_lock_state", 00:11:27.884 "bdev_opal_delete", 00:11:27.884 "bdev_opal_get_info", 00:11:27.884 "bdev_opal_create", 00:11:27.884 "bdev_nvme_opal_revert", 00:11:27.884 "bdev_nvme_opal_init", 00:11:27.884 "bdev_nvme_send_cmd", 00:11:27.884 
"bdev_nvme_get_path_iostat", 00:11:27.884 "bdev_nvme_get_mdns_discovery_info", 00:11:27.884 "bdev_nvme_stop_mdns_discovery", 00:11:27.884 "bdev_nvme_start_mdns_discovery", 00:11:27.884 "bdev_nvme_set_multipath_policy", 00:11:27.884 "bdev_nvme_set_preferred_path", 00:11:27.884 "bdev_nvme_get_io_paths", 00:11:27.884 "bdev_nvme_remove_error_injection", 00:11:27.884 "bdev_nvme_add_error_injection", 00:11:27.884 "bdev_nvme_get_discovery_info", 00:11:27.884 "bdev_nvme_stop_discovery", 00:11:27.884 "bdev_nvme_start_discovery", 00:11:27.884 "bdev_nvme_get_controller_health_info", 00:11:27.884 "bdev_nvme_disable_controller", 00:11:27.884 "bdev_nvme_enable_controller", 00:11:27.884 "bdev_nvme_reset_controller", 00:11:27.884 "bdev_nvme_get_transport_statistics", 00:11:27.884 "bdev_nvme_apply_firmware", 00:11:27.884 "bdev_nvme_detach_controller", 00:11:27.884 "bdev_nvme_get_controllers", 00:11:27.884 "bdev_nvme_attach_controller", 00:11:27.884 "bdev_nvme_set_hotplug", 00:11:27.884 "bdev_nvme_set_options", 00:11:27.884 "bdev_passthru_delete", 00:11:27.884 "bdev_passthru_create", 00:11:27.884 "bdev_lvol_set_parent_bdev", 00:11:27.884 "bdev_lvol_set_parent", 00:11:27.884 "bdev_lvol_check_shallow_copy", 00:11:27.884 "bdev_lvol_start_shallow_copy", 00:11:27.884 "bdev_lvol_grow_lvstore", 00:11:27.884 "bdev_lvol_get_lvols", 00:11:27.884 "bdev_lvol_get_lvstores", 00:11:27.884 "bdev_lvol_delete", 00:11:27.884 "bdev_lvol_set_read_only", 00:11:27.884 "bdev_lvol_resize", 00:11:27.884 "bdev_lvol_decouple_parent", 00:11:27.884 "bdev_lvol_inflate", 00:11:27.884 "bdev_lvol_rename", 00:11:27.884 "bdev_lvol_clone_bdev", 00:11:27.884 "bdev_lvol_clone", 00:11:27.884 "bdev_lvol_snapshot", 00:11:27.884 "bdev_lvol_create", 00:11:27.884 "bdev_lvol_delete_lvstore", 00:11:27.884 "bdev_lvol_rename_lvstore", 00:11:27.884 "bdev_lvol_create_lvstore", 00:11:27.884 "bdev_raid_set_options", 00:11:27.884 "bdev_raid_remove_base_bdev", 00:11:27.884 "bdev_raid_add_base_bdev", 00:11:27.884 "bdev_raid_delete", 
00:11:27.884 "bdev_raid_create", 00:11:27.884 "bdev_raid_get_bdevs", 00:11:27.884 "bdev_error_inject_error", 00:11:27.884 "bdev_error_delete", 00:11:27.884 "bdev_error_create", 00:11:27.884 "bdev_split_delete", 00:11:27.884 "bdev_split_create", 00:11:27.884 "bdev_delay_delete", 00:11:27.884 "bdev_delay_create", 00:11:27.884 "bdev_delay_update_latency", 00:11:27.884 "bdev_zone_block_delete", 00:11:27.884 "bdev_zone_block_create", 00:11:27.884 "blobfs_create", 00:11:27.884 "blobfs_detect", 00:11:27.884 "blobfs_set_cache_size", 00:11:27.884 "bdev_crypto_delete", 00:11:27.884 "bdev_crypto_create", 00:11:27.884 "bdev_compress_delete", 00:11:27.884 "bdev_compress_create", 00:11:27.884 "bdev_compress_get_orphans", 00:11:27.884 "bdev_aio_delete", 00:11:27.884 "bdev_aio_rescan", 00:11:27.884 "bdev_aio_create", 00:11:27.884 "bdev_ftl_set_property", 00:11:27.884 "bdev_ftl_get_properties", 00:11:27.884 "bdev_ftl_get_stats", 00:11:27.884 "bdev_ftl_unmap", 00:11:27.884 "bdev_ftl_unload", 00:11:27.884 "bdev_ftl_delete", 00:11:27.884 "bdev_ftl_load", 00:11:27.884 "bdev_ftl_create", 00:11:27.884 "bdev_virtio_attach_controller", 00:11:27.884 "bdev_virtio_scsi_get_devices", 00:11:27.884 "bdev_virtio_detach_controller", 00:11:27.884 "bdev_virtio_blk_set_hotplug", 00:11:27.884 "bdev_iscsi_delete", 00:11:27.884 "bdev_iscsi_create", 00:11:27.884 "bdev_iscsi_set_options", 00:11:27.884 "accel_error_inject_error", 00:11:27.884 "ioat_scan_accel_module", 00:11:27.884 "dsa_scan_accel_module", 00:11:27.884 "iaa_scan_accel_module", 00:11:27.884 "dpdk_cryptodev_get_driver", 00:11:27.884 "dpdk_cryptodev_set_driver", 00:11:27.884 "dpdk_cryptodev_scan_accel_module", 00:11:27.884 "compressdev_scan_accel_module", 00:11:27.884 "keyring_file_remove_key", 00:11:27.884 "keyring_file_add_key", 00:11:27.884 "keyring_linux_set_options", 00:11:27.884 "iscsi_get_histogram", 00:11:27.884 "iscsi_enable_histogram", 00:11:27.884 "iscsi_set_options", 00:11:27.884 "iscsi_get_auth_groups", 00:11:27.884 
"iscsi_auth_group_remove_secret", 00:11:27.884 "iscsi_auth_group_add_secret", 00:11:27.884 "iscsi_delete_auth_group", 00:11:27.884 "iscsi_create_auth_group", 00:11:27.884 "iscsi_set_discovery_auth", 00:11:27.884 "iscsi_get_options", 00:11:27.884 "iscsi_target_node_request_logout", 00:11:27.884 "iscsi_target_node_set_redirect", 00:11:27.884 "iscsi_target_node_set_auth", 00:11:27.884 "iscsi_target_node_add_lun", 00:11:27.884 "iscsi_get_stats", 00:11:27.884 "iscsi_get_connections", 00:11:27.884 "iscsi_portal_group_set_auth", 00:11:27.884 "iscsi_start_portal_group", 00:11:27.884 "iscsi_delete_portal_group", 00:11:27.884 "iscsi_create_portal_group", 00:11:27.884 "iscsi_get_portal_groups", 00:11:27.884 "iscsi_delete_target_node", 00:11:27.884 "iscsi_target_node_remove_pg_ig_maps", 00:11:27.884 "iscsi_target_node_add_pg_ig_maps", 00:11:27.884 "iscsi_create_target_node", 00:11:27.884 "iscsi_get_target_nodes", 00:11:27.884 "iscsi_delete_initiator_group", 00:11:27.884 "iscsi_initiator_group_remove_initiators", 00:11:27.884 "iscsi_initiator_group_add_initiators", 00:11:27.884 "iscsi_create_initiator_group", 00:11:27.884 "iscsi_get_initiator_groups", 00:11:27.884 "nvmf_set_crdt", 00:11:27.884 "nvmf_set_config", 00:11:27.884 "nvmf_set_max_subsystems", 00:11:27.884 "nvmf_stop_mdns_prr", 00:11:27.884 "nvmf_publish_mdns_prr", 00:11:27.884 "nvmf_subsystem_get_listeners", 00:11:27.884 "nvmf_subsystem_get_qpairs", 00:11:27.884 "nvmf_subsystem_get_controllers", 00:11:27.884 "nvmf_get_stats", 00:11:27.884 "nvmf_get_transports", 00:11:27.884 "nvmf_create_transport", 00:11:27.884 "nvmf_get_targets", 00:11:27.884 "nvmf_delete_target", 00:11:27.884 "nvmf_create_target", 00:11:27.884 "nvmf_subsystem_allow_any_host", 00:11:27.884 "nvmf_subsystem_remove_host", 00:11:27.884 "nvmf_subsystem_add_host", 00:11:27.884 "nvmf_ns_remove_host", 00:11:27.884 "nvmf_ns_add_host", 00:11:27.884 "nvmf_subsystem_remove_ns", 00:11:27.884 "nvmf_subsystem_add_ns", 00:11:27.884 
"nvmf_subsystem_listener_set_ana_state", 00:11:27.884 "nvmf_discovery_get_referrals", 00:11:27.884 "nvmf_discovery_remove_referral", 00:11:27.884 "nvmf_discovery_add_referral", 00:11:27.884 "nvmf_subsystem_remove_listener", 00:11:27.884 "nvmf_subsystem_add_listener", 00:11:27.884 "nvmf_delete_subsystem", 00:11:27.884 "nvmf_create_subsystem", 00:11:27.884 "nvmf_get_subsystems", 00:11:27.884 "env_dpdk_get_mem_stats", 00:11:27.884 "nbd_get_disks", 00:11:27.884 "nbd_stop_disk", 00:11:27.884 "nbd_start_disk", 00:11:27.884 "ublk_recover_disk", 00:11:27.884 "ublk_get_disks", 00:11:27.884 "ublk_stop_disk", 00:11:27.884 "ublk_start_disk", 00:11:27.884 "ublk_destroy_target", 00:11:27.884 "ublk_create_target", 00:11:27.884 "virtio_blk_create_transport", 00:11:27.884 "virtio_blk_get_transports", 00:11:27.884 "vhost_controller_set_coalescing", 00:11:27.884 "vhost_get_controllers", 00:11:27.884 "vhost_delete_controller", 00:11:27.884 "vhost_create_blk_controller", 00:11:27.884 "vhost_scsi_controller_remove_target", 00:11:27.884 "vhost_scsi_controller_add_target", 00:11:27.884 "vhost_start_scsi_controller", 00:11:27.884 "vhost_create_scsi_controller", 00:11:27.884 "thread_set_cpumask", 00:11:27.884 "framework_get_governor", 00:11:27.884 "framework_get_scheduler", 00:11:27.884 "framework_set_scheduler", 00:11:27.884 "framework_get_reactors", 00:11:27.884 "thread_get_io_channels", 00:11:27.884 "thread_get_pollers", 00:11:27.884 "thread_get_stats", 00:11:27.884 "framework_monitor_context_switch", 00:11:27.884 "spdk_kill_instance", 00:11:27.884 "log_enable_timestamps", 00:11:27.884 "log_get_flags", 00:11:27.884 "log_clear_flag", 00:11:27.884 "log_set_flag", 00:11:27.884 "log_get_level", 00:11:27.884 "log_set_level", 00:11:27.884 "log_get_print_level", 00:11:27.884 "log_set_print_level", 00:11:27.884 "framework_enable_cpumask_locks", 00:11:27.884 "framework_disable_cpumask_locks", 00:11:27.884 "framework_wait_init", 00:11:27.884 "framework_start_init", 00:11:27.884 "scsi_get_devices", 
00:11:27.884 "bdev_get_histogram", 00:11:27.884 "bdev_enable_histogram", 00:11:27.884 "bdev_set_qos_limit", 00:11:27.884 "bdev_set_qd_sampling_period", 00:11:27.884 "bdev_get_bdevs", 00:11:27.884 "bdev_reset_iostat", 00:11:27.884 "bdev_get_iostat", 00:11:27.884 "bdev_examine", 00:11:27.884 "bdev_wait_for_examine", 00:11:27.884 "bdev_set_options", 00:11:27.884 "notify_get_notifications", 00:11:27.884 "notify_get_types", 00:11:27.884 "accel_get_stats", 00:11:27.884 "accel_set_options", 00:11:27.884 "accel_set_driver", 00:11:27.884 "accel_crypto_key_destroy", 00:11:27.885 "accel_crypto_keys_get", 00:11:27.885 "accel_crypto_key_create", 00:11:27.885 "accel_assign_opc", 00:11:27.885 "accel_get_module_info", 00:11:27.885 "accel_get_opc_assignments", 00:11:27.885 "vmd_rescan", 00:11:27.885 "vmd_remove_device", 00:11:27.885 "vmd_enable", 00:11:27.885 "sock_get_default_impl", 00:11:27.885 "sock_set_default_impl", 00:11:27.885 "sock_impl_set_options", 00:11:27.885 "sock_impl_get_options", 00:11:27.885 "iobuf_get_stats", 00:11:27.885 "iobuf_set_options", 00:11:27.885 "framework_get_pci_devices", 00:11:27.885 "framework_get_config", 00:11:27.885 "framework_get_subsystems", 00:11:27.885 "trace_get_info", 00:11:27.885 "trace_get_tpoint_group_mask", 00:11:27.885 "trace_disable_tpoint_group", 00:11:27.885 "trace_enable_tpoint_group", 00:11:27.885 "trace_clear_tpoint_mask", 00:11:27.885 "trace_set_tpoint_mask", 00:11:27.885 "keyring_get_keys", 00:11:27.885 "spdk_get_version", 00:11:27.885 "rpc_get_methods" 00:11:27.885 ] 00:11:28.147 17:05:23 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:11:28.147 17:05:23 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:11:28.147 17:05:23 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 4074667 00:11:28.147 17:05:23 spdkcli_tcp -- 
common/autotest_common.sh@948 -- # '[' -z 4074667 ']' 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 4074667 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4074667 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4074667' 00:11:28.147 killing process with pid 4074667 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 4074667 00:11:28.147 17:05:23 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 4074667 00:11:28.405 00:11:28.405 real 0m2.109s 00:11:28.405 user 0m4.113s 00:11:28.405 sys 0m0.675s 00:11:28.405 17:05:23 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:28.405 17:05:23 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:11:28.405 ************************************ 00:11:28.405 END TEST spdkcli_tcp 00:11:28.405 ************************************ 00:11:28.663 17:05:23 -- common/autotest_common.sh@1142 -- # return 0 00:11:28.663 17:05:23 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:11:28.663 17:05:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:28.663 17:05:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:28.663 17:05:23 -- common/autotest_common.sh@10 -- # set +x 00:11:28.663 ************************************ 00:11:28.663 START TEST dpdk_mem_utility 00:11:28.663 ************************************ 00:11:28.663 17:05:23 dpdk_mem_utility -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:11:28.663 * Looking for test storage... 00:11:28.663 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:11:28.663 17:05:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:11:28.663 17:05:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=4075010 00:11:28.663 17:05:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 4075010 00:11:28.663 17:05:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:11:28.663 17:05:23 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 4075010 ']' 00:11:28.663 17:05:23 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:28.663 17:05:23 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:28.663 17:05:23 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:28.663 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:28.663 17:05:23 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:28.663 17:05:23 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:11:28.663 [2024-07-23 17:05:24.055737] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:11:28.663 [2024-07-23 17:05:24.055813] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4075010 ] 00:11:28.922 [2024-07-23 17:05:24.191260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:28.922 [2024-07-23 17:05:24.247976] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.892 17:05:24 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:29.892 17:05:24 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:11:29.892 17:05:24 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:11:29.892 17:05:24 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:11:29.892 17:05:24 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:29.893 17:05:24 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:11:29.893 { 00:11:29.893 "filename": "/tmp/spdk_mem_dump.txt" 00:11:29.893 } 00:11:29.893 17:05:24 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:29.893 17:05:24 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:11:29.893 DPDK memory size 816.000000 MiB in 2 heap(s) 00:11:29.893 2 heaps totaling size 816.000000 MiB 00:11:29.893 size: 814.000000 MiB heap id: 0 00:11:29.893 size: 2.000000 MiB heap id: 1 00:11:29.893 end heaps---------- 00:11:29.893 8 mempools totaling size 598.116089 MiB 00:11:29.893 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:11:29.893 size: 158.602051 MiB name: PDU_data_out_Pool 00:11:29.893 size: 84.521057 MiB name: bdev_io_4075010 00:11:29.893 size: 51.011292 MiB name: evtpool_4075010 00:11:29.893 size: 
50.003479 MiB name: msgpool_4075010 00:11:29.893 size: 21.763794 MiB name: PDU_Pool 00:11:29.893 size: 19.513306 MiB name: SCSI_TASK_Pool 00:11:29.893 size: 0.026123 MiB name: Session_Pool 00:11:29.893 end mempools------- 00:11:29.893 201 memzones totaling size 4.173645 MiB 00:11:29.893 size: 1.000366 MiB name: RG_ring_0_4075010 00:11:29.893 size: 1.000366 MiB name: RG_ring_1_4075010 00:11:29.893 size: 1.000366 MiB name: RG_ring_4_4075010 00:11:29.893 size: 1.000366 MiB name: RG_ring_5_4075010 00:11:29.893 size: 0.125366 MiB name: RG_ring_2_4075010 00:11:29.893 size: 0.015991 MiB name: RG_ring_3_4075010 00:11:29.893 size: 0.001282 MiB name: QAT_SYM_CAPA_GEN_1 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.0_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.1_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.2_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.3_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.4_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.5_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.6_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:01.7_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.0_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.1_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.2_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.3_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.4_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.5_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.6_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3d:02.7_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.0_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.1_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.2_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.3_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.4_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.5_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.6_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:01.7_qat 
00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.0_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.1_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.2_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.3_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.4_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.5_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.6_qat 00:11:29.893 size: 0.000244 MiB name: 0000:3f:02.7_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.0_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.1_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.2_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.3_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.4_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.5_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.6_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:01.7_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.0_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.1_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.2_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.3_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.4_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.5_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.6_qat 00:11:29.893 size: 0.000244 MiB name: 0000:da:02.7_qat 00:11:29.893 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_0 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_0 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_1 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_2 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_1 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_3 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_4 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_2 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_5 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_6 00:11:29.893 size: 
0.000122 MiB name: rte_compressdev_data_3 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_7 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_8 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_4 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_9 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_10 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_5 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_11 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_12 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_6 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_13 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_14 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_7 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_15 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_16 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_8 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_17 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_18 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_9 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_19 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_20 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_10 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_21 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_22 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_11 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_23 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_24 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_12 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_25 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_26 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_13 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_27 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_28 00:11:29.893 size: 
0.000122 MiB name: rte_compressdev_data_14 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_29 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_30 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_15 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_31 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_32 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_16 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_33 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_34 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_17 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_35 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_36 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_18 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_37 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_38 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_19 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_39 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_40 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_20 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_41 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_42 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_21 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_43 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_44 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_22 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_45 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_46 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_23 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_47 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_48 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_24 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_49 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_50 00:11:29.893 
size: 0.000122 MiB name: rte_compressdev_data_25 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_51 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_52 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_26 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_53 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_54 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_27 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_55 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_56 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_28 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_57 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_58 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_29 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_59 00:11:29.893 size: 0.000122 MiB name: rte_cryptodev_data_60 00:11:29.893 size: 0.000122 MiB name: rte_compressdev_data_30 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_61 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_62 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_31 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_63 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_64 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_32 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_65 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_66 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_33 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_67 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_68 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_34 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_69 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_70 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_35 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_71 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_36 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_73 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_74 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_37 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_75 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_76 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_38 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_77 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_78 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_39 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_79 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_80 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_40 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_81 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_82 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_41 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_83 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_84 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_42 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_85 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_86 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_43 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_87 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_88 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_44 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_89 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_90 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_45 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_91 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_92 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_46 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_93 00:11:29.894 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:11:29.894 size: 0.000122 MiB name: rte_compressdev_data_47 00:11:29.894 size: 0.000122 MiB name: rte_cryptodev_data_95 00:11:29.894 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:11:29.894 end memzones------- 00:11:29.894 17:05:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:11:29.894 heap id: 0 total size: 814.000000 MiB number of busy elements: 531 number of free elements: 14 00:11:29.894 list of free elements. size: 11.818298 MiB 00:11:29.894 element at address: 0x200000400000 with size: 1.999512 MiB 00:11:29.894 element at address: 0x200018e00000 with size: 0.999878 MiB 00:11:29.894 element at address: 0x200019000000 with size: 0.999878 MiB 00:11:29.894 element at address: 0x200003e00000 with size: 0.996460 MiB 00:11:29.894 element at address: 0x200031c00000 with size: 0.994446 MiB 00:11:29.894 element at address: 0x200013800000 with size: 0.978882 MiB 00:11:29.894 element at address: 0x200007000000 with size: 0.960022 MiB 00:11:29.894 element at address: 0x200019200000 with size: 0.937256 MiB 00:11:29.894 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:11:29.894 element at address: 0x200003a00000 with size: 0.498535 MiB 00:11:29.894 element at address: 0x20000b200000 with size: 0.491272 MiB 00:11:29.894 element at address: 0x200000800000 with size: 0.486511 MiB 00:11:29.894 element at address: 0x200019400000 with size: 0.485840 MiB 00:11:29.894 element at address: 0x200027e00000 with size: 0.406555 MiB 00:11:29.894 list of standard malloc elements. 
size: 199.876221 MiB 00:11:29.894 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:11:29.894 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:11:29.894 element at address: 0x200018efff80 with size: 1.000122 MiB 00:11:29.894 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:11:29.894 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:11:29.894 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:11:29.894 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:11:29.894 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:11:29.894 element at address: 0x200000332280 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000335780 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000338c80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000033c180 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000033f680 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000342b80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000346080 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000349580 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000034ca80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000034ff80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000353480 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000356980 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000359e80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000035d380 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000360880 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000363d80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003677c0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000036b200 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000036ec40 with size: 0.004333 MiB 00:11:29.894 element at 
address: 0x200000372680 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003760c0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000379b00 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000037d540 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000380f80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003849c0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000388400 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000038be40 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000038f880 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003932c0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000396d00 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000039a740 with size: 0.004333 MiB 00:11:29.894 element at address: 0x20000039e180 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003a1bc0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003a5600 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003a9040 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003aca80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003b04c0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003b3f00 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003b7940 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003bb380 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003bedc0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003c2800 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003c6240 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003c9c80 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003cd6c0 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003d1100 with size: 0.004333 MiB 00:11:29.894 element at address: 0x2000003d4b40 with size: 0.004333 MiB 
00:11:29.894 element at address: 0x2000003d8d40 with size: 0.004333 MiB 00:11:29.894 element at address: 0x200000330180 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000331200 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000333680 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000334700 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000336b80 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000337c00 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000033a080 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000033b100 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000033d580 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000033e600 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000340a80 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000341b00 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000343f80 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000345000 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000347480 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000348500 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000034a980 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000034ba00 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000034de80 with size: 0.004028 MiB 00:11:29.894 element at address: 0x20000034ef00 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000351380 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000352400 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000354880 with size: 0.004028 MiB 00:11:29.894 element at address: 0x200000355900 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000357d80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000358e00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000035b280 with 
size: 0.004028 MiB 00:11:29.895 element at address: 0x20000035c300 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000035e780 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000035f800 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000361c80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000362d00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003656c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000366740 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000369100 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000036a180 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000036cb40 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000036dbc0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000370580 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000371600 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000373fc0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000375040 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000377a00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000378a80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000037b440 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000037c4c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000037ee80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000037ff00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003828c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000383940 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000386300 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000387380 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000389d40 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000038adc0 with size: 0.004028 MiB 00:11:29.895 element at address: 
0x20000038d780 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000038e800 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003911c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000392240 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000394c00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000395c80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000398640 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003996c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000039c080 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000039d100 with size: 0.004028 MiB 00:11:29.895 element at address: 0x20000039fac0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003a0b40 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003a3500 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003a4580 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003a6f40 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003a7fc0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003aa980 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003aba00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003ae3c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003af440 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003b1e00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003b2e80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003b5840 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003b68c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003b9280 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003ba300 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003bccc0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003bdd40 with size: 0.004028 MiB 00:11:29.895 
element at address: 0x2000003c0700 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003c1780 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003c4140 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003c51c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003c7b80 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003c8c00 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003cb5c0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003cc640 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003cf000 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003d0080 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003d2a40 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003d3ac0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003d6c40 with size: 0.004028 MiB 00:11:29.895 element at address: 0x2000003d7cc0 with size: 0.004028 MiB 00:11:29.895 element at address: 0x200000206700 with size: 0.000305 MiB 00:11:29.895 element at address: 0x200000200000 with size: 0.000244 MiB 00:11:29.895 element at address: 0x200000200100 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002001c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200280 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200340 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200400 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002004c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200580 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200640 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200700 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002007c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200880 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200940 with size: 0.000183 
MiB 00:11:29.895 element at address: 0x200000200a00 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200ac0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200b80 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200c40 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200d00 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200dc0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200e80 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000200f40 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201000 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002010c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201180 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201240 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201300 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002013c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201480 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201540 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201600 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002016c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201780 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201840 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201900 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002019c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201a80 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201b40 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201c00 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201cc0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201d80 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201e40 
with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201f00 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000201fc0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202080 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202140 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202200 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002022c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202380 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202440 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202500 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002025c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202680 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202740 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202800 with size: 0.000183 MiB 00:11:29.895 element at address: 0x2000002028c0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202980 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202a40 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202b00 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202bc0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202c80 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202d40 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202e00 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202ec0 with size: 0.000183 MiB 00:11:29.895 element at address: 0x200000202f80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203040 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203100 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002031c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203280 with size: 0.000183 MiB 00:11:29.896 element at 
address: 0x200000203340 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203400 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002034c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203580 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203640 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203700 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002037c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203880 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203940 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203a00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203ac0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203b80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203c40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203d00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203dc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203e80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000203f40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204000 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002040c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204180 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204240 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204300 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002043c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204480 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204540 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204600 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002046c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204780 with size: 0.000183 MiB 
00:11:29.896 element at address: 0x200000204840 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204900 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002049c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204a80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204b40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204c00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204cc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204d80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204e40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204f00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000204fc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205080 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205140 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205200 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002052c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205380 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205440 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205500 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002055c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205680 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205740 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205800 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002058c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205980 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205a40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205b00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205bc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205c80 with 
size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205d40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205e00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205ec0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000205f80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206040 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206100 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002061c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206280 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206340 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206400 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206580 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206640 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206840 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206900 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002069c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206a80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206b40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206c00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206cc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206d80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206e40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206f00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000206fc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207080 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207140 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207200 with size: 0.000183 MiB 00:11:29.896 element at address: 
0x2000002072c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207380 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207440 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207500 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002075c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207680 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207740 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207800 with size: 0.000183 MiB 00:11:29.896 element at address: 0x2000002078c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207980 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207a40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000207c40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000020bf00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c1c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c280 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c340 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c400 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c4c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c580 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c640 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c700 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c7c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c880 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022c940 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022ca00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022cac0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022cb80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022cc40 with size: 0.000183 MiB 00:11:29.896 
element at address: 0x20000022cd00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022cdc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022ce80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d080 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d140 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d200 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d2c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d380 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d440 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d500 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d5c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d680 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d740 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d800 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d8c0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022d980 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022da40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022db00 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022dbc0 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000022dc80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000032fe80 with size: 0.000183 MiB 00:11:29.896 element at address: 0x20000032ff40 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000333440 with size: 0.000183 MiB 00:11:29.896 element at address: 0x200000336940 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000339e40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000033d340 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000340840 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000343d40 with size: 0.000183 
MiB 00:11:29.897 element at address: 0x200000347240 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000034a740 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000034dc40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000351140 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000354640 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000357b40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000035b040 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000035e540 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000361a40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000364f40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000365100 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003652c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000365380 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000368980 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000368b40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000368d00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000368dc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000036c3c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000036c580 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000036c740 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000036c800 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000036fe00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000036ffc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000370180 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000370240 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000373840 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000373a00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000373bc0 
with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000373c80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000377280 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000377440 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000377600 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003776c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037acc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037ae80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037b040 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037b100 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037e700 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037e8c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037ea80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000037eb40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000382140 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000382300 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003824c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000382580 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000385b80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000385d40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000385f00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000385fc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003895c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000389780 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000389940 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000389a00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000038d000 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000038d1c0 with size: 0.000183 MiB 00:11:29.897 element at 
address: 0x20000038d380 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000038d440 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000390a40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000390c00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000390dc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000390e80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000394480 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000394640 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000394800 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003948c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000397ec0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000398080 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000398240 with size: 0.000183 MiB 00:11:29.897 element at address: 0x200000398300 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039b900 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039bac0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039bc80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039bd40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039f340 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039f500 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039f6c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x20000039f780 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a2d80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a2f40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a3100 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a31c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a67c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a6980 with size: 0.000183 MiB 
00:11:29.897 element at address: 0x2000003a6b40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003a6c00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003aa200 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003aa3c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003aa580 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003aa640 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003adc40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003ade00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003adfc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003ae080 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b1680 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b1840 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b1a00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b1ac0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b50c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b5280 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b5440 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b5500 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b8b00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b8cc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b8e80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003b8f40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003bc540 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003bc700 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003bc8c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003bc980 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003bff80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c0140 with 
size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c0300 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c03c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c3d40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c3e00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c7400 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c75c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c7780 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003c7840 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cae40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cb000 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cb1c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cb280 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003ce880 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cea40 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cec00 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003cecc0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003d22c0 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003d2480 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003d2640 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003d2700 with size: 0.000183 MiB 00:11:29.897 element at address: 0x2000003d5e40 with size: 0.000183 MiB 00:11:29.898 element at address: 0x2000003d60c0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x2000003d6840 with size: 0.000183 MiB 00:11:29.898 element at address: 0x2000003d6900 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:11:29.898 element at address: 
0x20000087c980 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:11:29.898 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e68140 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e68200 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6ee00 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:11:29.898 
element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:11:29.898 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:11:29.898 list of memzone associated elements. size: 602.305481 MiB 00:11:29.898 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:11:29.898 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:11:29.898 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:11:29.898 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:11:29.898 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:11:29.898 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_4075010_0 00:11:29.898 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:11:29.898 associated memzone info: size: 48.002930 MiB name: MP_evtpool_4075010_0 00:11:29.898 element at address: 0x200003fff380 with size: 48.003052 MiB 00:11:29.898 associated memzone info: size: 48.002930 MiB name: MP_msgpool_4075010_0 00:11:29.898 element at address: 0x2000195be940 with size: 20.255554 MiB 00:11:29.898 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:11:29.898 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:11:29.898 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:11:29.898 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:11:29.898 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_4075010 00:11:29.898 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:11:29.898 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_4075010 00:11:29.898 element at address: 0x20000022dd40 with size: 1.008118 MiB 00:11:29.898 associated memzone info: size: 1.007996 MiB name: MP_evtpool_4075010 
00:11:29.898 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:11:29.898 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:11:29.898 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:11:29.898 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:11:29.898 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:11:29.898 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:11:29.898 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:11:29.898 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:11:29.898 element at address: 0x200003eff180 with size: 1.000488 MiB 00:11:29.898 associated memzone info: size: 1.000366 MiB name: RG_ring_0_4075010 00:11:29.898 element at address: 0x200003affc00 with size: 1.000488 MiB 00:11:29.898 associated memzone info: size: 1.000366 MiB name: RG_ring_1_4075010 00:11:29.898 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:11:29.898 associated memzone info: size: 1.000366 MiB name: RG_ring_4_4075010 00:11:29.898 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:11:29.898 associated memzone info: size: 1.000366 MiB name: RG_ring_5_4075010 00:11:29.898 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:11:29.898 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_4075010 00:11:29.898 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:11:29.898 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:11:29.898 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:11:29.898 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:11:29.898 element at address: 0x20001947c600 with size: 0.250488 MiB 00:11:29.898 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:11:29.898 element at address: 0x20000020bfc0 with size: 0.125488 MiB 00:11:29.898 associated memzone info: size: 0.125366 MiB name: 
RG_ring_2_4075010 00:11:29.898 element at address: 0x2000070f5c40 with size: 0.031738 MiB 00:11:29.898 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:11:29.898 element at address: 0x200027e682c0 with size: 0.023743 MiB 00:11:29.898 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:11:29.898 element at address: 0x200000207d00 with size: 0.016113 MiB 00:11:29.898 associated memzone info: size: 0.015991 MiB name: RG_ring_3_4075010 00:11:29.898 element at address: 0x200027e6e400 with size: 0.002441 MiB 00:11:29.898 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:11:29.898 element at address: 0x2000003d6280 with size: 0.001404 MiB 00:11:29.898 associated memzone info: size: 0.001282 MiB name: QAT_SYM_CAPA_GEN_1 00:11:29.898 element at address: 0x2000003d6ac0 with size: 0.000366 MiB 00:11:29.898 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.0_qat 00:11:29.898 element at address: 0x2000003d28c0 with size: 0.000366 MiB 00:11:29.898 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.1_qat 00:11:29.898 element at address: 0x2000003cee80 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.2_qat 00:11:29.899 element at address: 0x2000003cb440 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.3_qat 00:11:29.899 element at address: 0x2000003c7a00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.4_qat 00:11:29.899 element at address: 0x2000003c3fc0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.5_qat 00:11:29.899 element at address: 0x2000003c0580 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:01.6_qat 00:11:29.899 element at address: 0x2000003bcb40 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 
0000:3d:01.7_qat 00:11:29.899 element at address: 0x2000003b9100 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.0_qat 00:11:29.899 element at address: 0x2000003b56c0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.1_qat 00:11:29.899 element at address: 0x2000003b1c80 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.2_qat 00:11:29.899 element at address: 0x2000003ae240 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.3_qat 00:11:29.899 element at address: 0x2000003aa800 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.4_qat 00:11:29.899 element at address: 0x2000003a6dc0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.5_qat 00:11:29.899 element at address: 0x2000003a3380 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.6_qat 00:11:29.899 element at address: 0x20000039f940 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3d:02.7_qat 00:11:29.899 element at address: 0x20000039bf00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.0_qat 00:11:29.899 element at address: 0x2000003984c0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.1_qat 00:11:29.899 element at address: 0x200000394a80 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.2_qat 00:11:29.899 element at address: 0x200000391040 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.3_qat 00:11:29.899 element at address: 0x20000038d600 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.4_qat 
00:11:29.899 element at address: 0x200000389bc0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.5_qat 00:11:29.899 element at address: 0x200000386180 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.6_qat 00:11:29.899 element at address: 0x200000382740 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:01.7_qat 00:11:29.899 element at address: 0x20000037ed00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.0_qat 00:11:29.899 element at address: 0x20000037b2c0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.1_qat 00:11:29.899 element at address: 0x200000377880 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.2_qat 00:11:29.899 element at address: 0x200000373e40 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.3_qat 00:11:29.899 element at address: 0x200000370400 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.4_qat 00:11:29.899 element at address: 0x20000036c9c0 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.5_qat 00:11:29.899 element at address: 0x200000368f80 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.6_qat 00:11:29.899 element at address: 0x200000365540 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:3f:02.7_qat 00:11:29.899 element at address: 0x200000361b00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.0_qat 00:11:29.899 element at address: 0x20000035e600 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.1_qat 00:11:29.899 element at 
address: 0x20000035b100 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.2_qat 00:11:29.899 element at address: 0x200000357c00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.3_qat 00:11:29.899 element at address: 0x200000354700 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.4_qat 00:11:29.899 element at address: 0x200000351200 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.5_qat 00:11:29.899 element at address: 0x20000034dd00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.6_qat 00:11:29.899 element at address: 0x20000034a800 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:01.7_qat 00:11:29.899 element at address: 0x200000347300 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.0_qat 00:11:29.899 element at address: 0x200000343e00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.1_qat 00:11:29.899 element at address: 0x200000340900 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.2_qat 00:11:29.899 element at address: 0x20000033d400 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.3_qat 00:11:29.899 element at address: 0x200000339f00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.4_qat 00:11:29.899 element at address: 0x200000336a00 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.5_qat 00:11:29.899 element at address: 0x200000333500 with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.6_qat 00:11:29.899 element at address: 0x200000330000 
with size: 0.000366 MiB 00:11:29.899 associated memzone info: size: 0.000244 MiB name: 0000:da:02.7_qat 00:11:29.899 element at address: 0x2000003d5d00 with size: 0.000305 MiB 00:11:29.899 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:11:29.899 element at address: 0x20000022cf40 with size: 0.000305 MiB 00:11:29.899 associated memzone info: size: 0.000183 MiB name: MP_msgpool_4075010 00:11:29.899 element at address: 0x200000207b00 with size: 0.000305 MiB 00:11:29.899 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_4075010 00:11:29.899 element at address: 0x200027e6eec0 with size: 0.000305 MiB 00:11:29.899 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:11:29.899 element at address: 0x2000003d69c0 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:11:29.899 element at address: 0x2000003d6180 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:11:29.899 element at address: 0x2000003d5f00 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:11:29.899 element at address: 0x2000003d27c0 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:11:29.899 element at address: 0x2000003d2540 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:11:29.899 element at address: 0x2000003d2380 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:11:29.899 element at address: 0x2000003ced80 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:11:29.899 element at address: 0x2000003ceb00 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:11:29.899 element at 
address: 0x2000003ce940 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:11:29.899 element at address: 0x2000003cb340 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:11:29.899 element at address: 0x2000003cb0c0 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:11:29.899 element at address: 0x2000003caf00 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:11:29.899 element at address: 0x2000003c7900 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:11:29.899 element at address: 0x2000003c7680 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:11:29.899 element at address: 0x2000003c74c0 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:11:29.899 element at address: 0x2000003c3ec0 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:11:29.899 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:11:29.899 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:11:29.899 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:11:29.899 element at address: 0x2000003c0480 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:11:29.900 element at address: 0x2000003c0200 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:11:29.900 element at address: 0x2000003c0040 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: 
rte_cryptodev_data_13 00:11:29.900 element at address: 0x2000003bca40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:11:29.900 element at address: 0x2000003bc7c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:11:29.900 element at address: 0x2000003bc600 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:11:29.900 element at address: 0x2000003b9000 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:11:29.900 element at address: 0x2000003b8d80 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:11:29.900 element at address: 0x2000003b8bc0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:11:29.900 element at address: 0x2000003b55c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:11:29.900 element at address: 0x2000003b5340 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:11:29.900 element at address: 0x2000003b5180 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:11:29.900 element at address: 0x2000003b1b80 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:11:29.900 element at address: 0x2000003b1900 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:11:29.900 element at address: 0x2000003b1740 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:11:29.900 element at address: 0x2000003ae140 with size: 0.000244 MiB 00:11:29.900 
associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:11:29.900 element at address: 0x2000003adec0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:11:29.900 element at address: 0x2000003add00 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:11:29.900 element at address: 0x2000003aa700 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:11:29.900 element at address: 0x2000003aa480 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:11:29.900 element at address: 0x2000003aa2c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:11:29.900 element at address: 0x2000003a6cc0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:11:29.900 element at address: 0x2000003a6a40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:11:29.900 element at address: 0x2000003a6880 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:11:29.900 element at address: 0x2000003a3280 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:11:29.900 element at address: 0x2000003a3000 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:11:29.900 element at address: 0x2000003a2e40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:11:29.900 element at address: 0x20000039f840 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:11:29.900 element at address: 
0x20000039f5c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:11:29.900 element at address: 0x20000039f400 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:11:29.900 element at address: 0x20000039be00 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:11:29.900 element at address: 0x20000039bb80 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:11:29.900 element at address: 0x20000039b9c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:11:29.900 element at address: 0x2000003983c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:11:29.900 element at address: 0x200000398140 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:11:29.900 element at address: 0x200000397f80 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:11:29.900 element at address: 0x200000394980 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:11:29.900 element at address: 0x200000394700 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:11:29.900 element at address: 0x200000394540 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:11:29.900 element at address: 0x200000390f40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:11:29.900 element at address: 0x200000390cc0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: 
rte_compressdev_data_19 00:11:29.900 element at address: 0x200000390b00 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:11:29.900 element at address: 0x20000038d500 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:11:29.900 element at address: 0x20000038d280 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:11:29.900 element at address: 0x20000038d0c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:11:29.900 element at address: 0x200000389ac0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:11:29.900 element at address: 0x200000389840 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:11:29.900 element at address: 0x200000389680 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:11:29.900 element at address: 0x200000386080 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:11:29.900 element at address: 0x200000385e00 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:11:29.900 element at address: 0x200000385c40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:11:29.900 element at address: 0x200000382640 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:11:29.900 element at address: 0x2000003823c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:11:29.900 element at address: 0x200000382200 with size: 0.000244 MiB 
00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:11:29.900 element at address: 0x20000037ec00 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:11:29.900 element at address: 0x20000037e980 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:11:29.900 element at address: 0x20000037e7c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:11:29.900 element at address: 0x20000037b1c0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:11:29.900 element at address: 0x20000037af40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:11:29.900 element at address: 0x20000037ad80 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:11:29.900 element at address: 0x200000377780 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:11:29.900 element at address: 0x200000377500 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:11:29.900 element at address: 0x200000377340 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:11:29.900 element at address: 0x200000373d40 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:11:29.900 element at address: 0x200000373ac0 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:11:29.900 element at address: 0x200000373900 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:11:29.900 element 
at address: 0x200000370300 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:11:29.900 element at address: 0x200000370080 with size: 0.000244 MiB 00:11:29.900 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:11:29.901 element at address: 0x20000036fec0 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:11:29.901 element at address: 0x20000036c8c0 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:11:29.901 element at address: 0x20000036c640 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:11:29.901 element at address: 0x20000036c480 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:11:29.901 element at address: 0x200000368e80 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:11:29.901 element at address: 0x200000368c00 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:11:29.901 element at address: 0x200000368a40 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:11:29.901 element at address: 0x200000365440 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:11:29.901 element at address: 0x2000003651c0 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:11:29.901 element at address: 0x200000365000 with size: 0.000244 MiB 00:11:29.901 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:11:29.901 element at address: 0x2000003d6000 with size: 0.000183 MiB 00:11:29.901 associated memzone info: size: 0.000061 MiB 
name: QAT_COMP_CAPA_GEN_1 00:11:29.901 17:05:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:11:29.901 17:05:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 4075010 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 4075010 ']' 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 4075010 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4075010 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4075010' 00:11:29.901 killing process with pid 4075010 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 4075010 00:11:29.901 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 4075010 00:11:30.468 00:11:30.468 real 0m1.738s 00:11:30.468 user 0m1.877s 00:11:30.468 sys 0m0.572s 00:11:30.468 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:30.468 17:05:25 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:11:30.468 ************************************ 00:11:30.468 END TEST dpdk_mem_utility 00:11:30.468 ************************************ 00:11:30.468 17:05:25 -- common/autotest_common.sh@1142 -- # return 0 00:11:30.468 17:05:25 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:11:30.468 17:05:25 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:30.468 17:05:25 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:11:30.468 17:05:25 -- common/autotest_common.sh@10 -- # set +x 00:11:30.468 ************************************ 00:11:30.468 START TEST event 00:11:30.468 ************************************ 00:11:30.468 17:05:25 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:11:30.468 * Looking for test storage... 00:11:30.468 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:11:30.468 17:05:25 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:30.468 17:05:25 event -- bdev/nbd_common.sh@6 -- # set -e 00:11:30.468 17:05:25 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:11:30.468 17:05:25 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:11:30.468 17:05:25 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:30.468 17:05:25 event -- common/autotest_common.sh@10 -- # set +x 00:11:30.468 ************************************ 00:11:30.468 START TEST event_perf 00:11:30.468 ************************************ 00:11:30.468 17:05:25 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:11:30.468 Running I/O for 1 seconds...[2024-07-23 17:05:25.875652] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:11:30.468 [2024-07-23 17:05:25.875718] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4075308 ] 00:11:30.727 [2024-07-23 17:05:26.005495] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:30.727 [2024-07-23 17:05:26.063291] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:30.727 [2024-07-23 17:05:26.063392] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:30.727 [2024-07-23 17:05:26.063494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:11:30.727 [2024-07-23 17:05:26.063495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:32.104 Running I/O for 1 seconds... 00:11:32.104 lcore 0: 102653 00:11:32.104 lcore 1: 102654 00:11:32.104 lcore 2: 102656 00:11:32.104 lcore 3: 102655 00:11:32.104 done. 
00:11:32.104 00:11:32.104 real 0m1.296s 00:11:32.104 user 0m4.143s 00:11:32.104 sys 0m0.142s 00:11:32.104 17:05:27 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:32.104 17:05:27 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:11:32.104 ************************************ 00:11:32.104 END TEST event_perf 00:11:32.104 ************************************ 00:11:32.104 17:05:27 event -- common/autotest_common.sh@1142 -- # return 0 00:11:32.104 17:05:27 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:11:32.104 17:05:27 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:32.104 17:05:27 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:32.104 17:05:27 event -- common/autotest_common.sh@10 -- # set +x 00:11:32.104 ************************************ 00:11:32.104 START TEST event_reactor 00:11:32.104 ************************************ 00:11:32.104 17:05:27 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:11:32.104 [2024-07-23 17:05:27.254267] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:11:32.104 [2024-07-23 17:05:27.254345] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4075516 ] 00:11:32.104 [2024-07-23 17:05:27.384672] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:32.104 [2024-07-23 17:05:27.437531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:33.483 test_start 00:11:33.483 oneshot 00:11:33.483 tick 100 00:11:33.483 tick 100 00:11:33.483 tick 250 00:11:33.483 tick 100 00:11:33.483 tick 100 00:11:33.483 tick 250 00:11:33.483 tick 100 00:11:33.483 tick 500 00:11:33.483 tick 100 00:11:33.483 tick 100 00:11:33.483 tick 250 00:11:33.483 tick 100 00:11:33.483 tick 100 00:11:33.483 test_end 00:11:33.483 00:11:33.483 real 0m1.288s 00:11:33.483 user 0m1.132s 00:11:33.483 sys 0m0.151s 00:11:33.483 17:05:28 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:33.483 17:05:28 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:11:33.483 ************************************ 00:11:33.483 END TEST event_reactor 00:11:33.483 ************************************ 00:11:33.483 17:05:28 event -- common/autotest_common.sh@1142 -- # return 0 00:11:33.483 17:05:28 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:11:33.483 17:05:28 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:33.483 17:05:28 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:33.483 17:05:28 event -- common/autotest_common.sh@10 -- # set +x 00:11:33.483 ************************************ 00:11:33.483 START TEST event_reactor_perf 00:11:33.483 ************************************ 00:11:33.483 17:05:28 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:11:33.483 [2024-07-23 17:05:28.629923] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:11:33.483 [2024-07-23 17:05:28.629986] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4075720 ] 00:11:33.483 [2024-07-23 17:05:28.759861] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:33.483 [2024-07-23 17:05:28.813047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.862 test_start 00:11:34.862 test_end 00:11:34.862 Performance: 327362 events per second 00:11:34.862 00:11:34.862 real 0m1.289s 00:11:34.862 user 0m1.149s 00:11:34.862 sys 0m0.134s 00:11:34.862 17:05:29 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:34.862 17:05:29 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:11:34.862 ************************************ 00:11:34.862 END TEST event_reactor_perf 00:11:34.862 ************************************ 00:11:34.862 17:05:29 event -- common/autotest_common.sh@1142 -- # return 0 00:11:34.862 17:05:29 event -- event/event.sh@49 -- # uname -s 00:11:34.862 17:05:29 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:11:34.862 17:05:29 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:11:34.862 17:05:29 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:34.862 17:05:29 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.862 17:05:29 event -- common/autotest_common.sh@10 -- # set +x 00:11:34.862 ************************************ 00:11:34.862 START TEST event_scheduler 00:11:34.862 ************************************ 
00:11:34.862 17:05:29 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:11:34.862 * Looking for test storage... 00:11:34.862 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:11:34.862 17:05:30 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:11:34.862 17:05:30 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=4076014 00:11:34.862 17:05:30 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:11:34.862 17:05:30 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 4076014 00:11:34.862 17:05:30 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 4076014 ']' 00:11:34.862 17:05:30 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:34.862 17:05:30 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:11:34.862 17:05:30 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:34.862 17:05:30 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:34.862 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:34.862 17:05:30 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:34.862 17:05:30 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:11:34.862 [2024-07-23 17:05:30.163172] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:11:34.862 [2024-07-23 17:05:30.163244] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4076014 ] 00:11:35.122 [2024-07-23 17:05:30.360148] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:35.122 [2024-07-23 17:05:30.445400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:35.122 [2024-07-23 17:05:30.445484] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:35.122 [2024-07-23 17:05:30.445584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:11:35.122 [2024-07-23 17:05:30.445595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:11:35.691 17:05:31 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:35.691 17:05:31 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:11:35.691 17:05:31 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:11:35.691 17:05:31 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.691 17:05:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:11:35.691 [2024-07-23 17:05:31.044972] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:11:35.691 [2024-07-23 17:05:31.045029] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:11:35.691 [2024-07-23 17:05:31.045064] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:11:35.691 [2024-07-23 17:05:31.045090] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:11:35.691 [2024-07-23 17:05:31.045114] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:11:35.691 17:05:31 
event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.691 17:05:31 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:11:35.691 17:05:31 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.691 17:05:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 [2024-07-23 17:05:31.166843] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:11:35.951 17:05:31 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:11:35.951 17:05:31 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:35.951 17:05:31 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 ************************************ 00:11:35.951 START TEST scheduler_create_thread 00:11:35.951 ************************************ 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 2 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 3 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 4 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 5 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 6 00:11:35.951 17:05:31 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 7 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 8 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 9 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:11:35.951 17:05:31 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:35.951 10 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:35.951 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:36.519 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:36.519 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:11:36.519 17:05:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:11:36.519 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:36.519 17:05:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:37.457 17:05:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:37.457 17:05:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:11:37.457 17:05:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:37.457 17:05:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:38.394 17:05:33 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:38.394 17:05:33 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:11:38.394 17:05:33 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:11:38.394 17:05:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:38.394 17:05:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:39.329 17:05:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:39.329 00:11:39.329 real 0m3.233s 00:11:39.329 user 0m0.021s 00:11:39.329 sys 0m0.010s 00:11:39.329 17:05:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:39.329 17:05:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:11:39.329 ************************************ 00:11:39.329 END TEST scheduler_create_thread 00:11:39.329 ************************************ 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:11:39.329 17:05:34 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:11:39.329 17:05:34 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 4076014 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 4076014 ']' 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 4076014 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4076014 00:11:39.329 17:05:34 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4076014' 00:11:39.329 killing process with pid 4076014 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 4076014 00:11:39.329 17:05:34 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 4076014 00:11:39.589 [2024-07-23 17:05:34.822177] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:11:39.849 00:11:39.849 real 0m5.191s 00:11:39.849 user 0m10.000s 00:11:39.849 sys 0m0.646s 00:11:39.849 17:05:35 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:39.849 17:05:35 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:11:39.849 ************************************ 00:11:39.849 END TEST event_scheduler 00:11:39.849 ************************************ 00:11:39.849 17:05:35 event -- common/autotest_common.sh@1142 -- # return 0 00:11:39.849 17:05:35 event -- event/event.sh@51 -- # modprobe -n nbd 00:11:39.849 17:05:35 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:11:39.849 17:05:35 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:39.849 17:05:35 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:39.849 17:05:35 event -- common/autotest_common.sh@10 -- # set +x 00:11:40.108 ************************************ 00:11:40.108 START TEST app_repeat 00:11:40.108 ************************************ 00:11:40.108 17:05:35 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:40.108 17:05:35 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@19 -- # repeat_pid=4076646 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 4076646' 00:11:40.108 Process app_repeat pid: 4076646 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:11:40.108 17:05:35 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:11:40.109 spdk_app_start Round 0 00:11:40.109 17:05:35 event.app_repeat -- event/event.sh@25 -- # waitforlisten 4076646 /var/tmp/spdk-nbd.sock 00:11:40.109 17:05:35 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:11:40.109 17:05:35 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4076646 ']' 00:11:40.109 17:05:35 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:11:40.109 17:05:35 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:40.109 17:05:35 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:11:40.109 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:11:40.109 17:05:35 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:40.109 17:05:35 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:11:40.109 [2024-07-23 17:05:35.331617] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:11:40.109 [2024-07-23 17:05:35.331745] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4076646 ] 00:11:40.109 [2024-07-23 17:05:35.529394] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:40.368 [2024-07-23 17:05:35.584629] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:40.368 [2024-07-23 17:05:35.584634] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:40.935 17:05:36 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:40.935 17:05:36 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:11:40.935 17:05:36 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:11:41.193 Malloc0 00:11:41.193 17:05:36 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:11:41.452 Malloc1 00:11:41.452 17:05:36 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:41.452 17:05:36 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:11:41.710 /dev/nbd0 00:11:41.710 17:05:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:41.710 17:05:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:41.710 17:05:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:41.710 17:05:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:11:41.710 17:05:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:41.710 17:05:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:41.710 17:05:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:11:41.711 17:05:37 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:11:41.711 1+0 records in 00:11:41.711 1+0 records out 00:11:41.711 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256702 s, 16.0 MB/s 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:41.711 17:05:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:11:41.711 17:05:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:41.711 17:05:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:41.711 17:05:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:11:41.969 /dev/nbd1 00:11:41.969 17:05:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:11:41.969 17:05:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:41.969 17:05:37 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:11:41.969 1+0 records in 00:11:41.969 1+0 records out 00:11:41.969 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208756 s, 19.6 MB/s 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:41.969 17:05:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:11:41.969 17:05:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:41.969 17:05:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:41.970 17:05:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:41.970 17:05:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:41.970 17:05:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:42.228 17:05:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:42.228 { 00:11:42.228 "nbd_device": "/dev/nbd0", 00:11:42.228 "bdev_name": "Malloc0" 00:11:42.228 }, 00:11:42.228 { 00:11:42.228 
"nbd_device": "/dev/nbd1", 00:11:42.228 "bdev_name": "Malloc1" 00:11:42.228 } 00:11:42.228 ]' 00:11:42.228 17:05:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:42.228 { 00:11:42.228 "nbd_device": "/dev/nbd0", 00:11:42.228 "bdev_name": "Malloc0" 00:11:42.228 }, 00:11:42.228 { 00:11:42.228 "nbd_device": "/dev/nbd1", 00:11:42.228 "bdev_name": "Malloc1" 00:11:42.228 } 00:11:42.228 ]' 00:11:42.228 17:05:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:42.228 17:05:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:11:42.228 /dev/nbd1' 00:11:42.228 17:05:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:11:42.228 /dev/nbd1' 00:11:42.228 17:05:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:11:42.487 256+0 records in 00:11:42.487 256+0 
records out 00:11:42.487 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107237 s, 97.8 MB/s 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:11:42.487 256+0 records in 00:11:42.487 256+0 records out 00:11:42.487 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0297813 s, 35.2 MB/s 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:11:42.487 256+0 records in 00:11:42.487 256+0 records out 00:11:42.487 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0314819 s, 33.3 MB/s 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:42.487 17:05:37 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@82 -- # 
for i in "${nbd_list[@]}" 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:42.488 17:05:37 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:42.746 17:05:37 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:43.006 17:05:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:11:43.264 17:05:38 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:11:43.265 17:05:38 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:11:43.265 17:05:38 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:11:43.523 17:05:38 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:11:43.781 [2024-07-23 17:05:39.001254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:43.781 [2024-07-23 17:05:39.054743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:43.781 [2024-07-23 17:05:39.054750] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:43.781 [2024-07-23 17:05:39.106846] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:11:43.781 [2024-07-23 17:05:39.106904] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:11:47.062 17:05:41 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:11:47.062 17:05:41 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:11:47.062 spdk_app_start Round 1 00:11:47.062 17:05:41 event.app_repeat -- event/event.sh@25 -- # waitforlisten 4076646 /var/tmp/spdk-nbd.sock 00:11:47.062 17:05:41 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4076646 ']' 00:11:47.062 17:05:41 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:11:47.062 17:05:41 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:47.062 17:05:41 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:11:47.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:11:47.062 17:05:41 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:47.062 17:05:41 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:11:47.062 17:05:42 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:47.062 17:05:42 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:11:47.062 17:05:42 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:11:47.062 Malloc0 00:11:47.062 17:05:42 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:11:47.321 Malloc1 00:11:47.321 17:05:42 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:47.321 17:05:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:47.322 17:05:42 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:11:47.888 /dev/nbd0 00:11:47.888 17:05:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:47.888 17:05:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:11:47.888 1+0 records in 00:11:47.888 1+0 records out 00:11:47.888 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221685 s, 18.5 MB/s 00:11:47.888 17:05:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:47.889 17:05:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:11:47.889 17:05:43 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:47.889 17:05:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:47.889 17:05:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:11:47.889 17:05:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:47.889 17:05:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:47.889 17:05:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:11:48.146 /dev/nbd1 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:11:48.146 1+0 records in 00:11:48.146 1+0 records out 00:11:48.146 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253556 s, 16.2 MB/s 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:48.146 17:05:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:48.146 17:05:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:48.406 17:05:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:48.406 { 00:11:48.406 "nbd_device": "/dev/nbd0", 00:11:48.406 "bdev_name": "Malloc0" 00:11:48.406 }, 00:11:48.406 { 00:11:48.406 "nbd_device": "/dev/nbd1", 00:11:48.406 "bdev_name": "Malloc1" 00:11:48.406 } 00:11:48.406 ]' 00:11:48.406 17:05:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:48.406 { 00:11:48.406 "nbd_device": "/dev/nbd0", 00:11:48.406 "bdev_name": "Malloc0" 00:11:48.406 }, 00:11:48.406 { 00:11:48.406 "nbd_device": "/dev/nbd1", 00:11:48.406 "bdev_name": "Malloc1" 00:11:48.406 } 00:11:48.406 ]' 00:11:48.406 17:05:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:48.406 17:05:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:11:48.406 /dev/nbd1' 00:11:48.406 17:05:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:11:48.406 /dev/nbd1' 00:11:48.406 
17:05:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:11:48.407 256+0 records in 00:11:48.407 256+0 records out 00:11:48.407 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104531 s, 100 MB/s 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:11:48.407 256+0 records in 00:11:48.407 256+0 records out 00:11:48.407 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0298228 s, 35.2 MB/s 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:11:48.407 256+0 records in 00:11:48.407 256+0 records out 00:11:48.407 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0312348 s, 33.6 MB/s 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:48.407 17:05:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:48.683 17:05:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:48.942 17:05:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:11:49.201 17:05:44 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:11:49.201 17:05:44 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:11:49.460 17:05:44 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:11:49.720 [2024-07-23 17:05:45.008449] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:49.720 [2024-07-23 17:05:45.060087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:49.720 [2024-07-23 17:05:45.060093] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:11:49.720 [2024-07-23 17:05:45.113526] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:11:49.720 [2024-07-23 17:05:45.113579] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:11:53.008 17:05:47 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:11:53.008 17:05:47 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:11:53.008 spdk_app_start Round 2 00:11:53.008 17:05:47 event.app_repeat -- event/event.sh@25 -- # waitforlisten 4076646 /var/tmp/spdk-nbd.sock 00:11:53.008 17:05:47 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4076646 ']' 00:11:53.008 17:05:47 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:11:53.008 17:05:47 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:53.008 17:05:47 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:11:53.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:11:53.008 17:05:47 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:53.008 17:05:47 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:11:53.008 17:05:48 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:53.008 17:05:48 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:11:53.008 17:05:48 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:11:53.008 Malloc0 00:11:53.008 17:05:48 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:11:53.268 Malloc1 00:11:53.268 17:05:48 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:53.268 17:05:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:11:53.528 /dev/nbd0 00:11:53.528 17:05:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:53.528 17:05:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:11:53.528 1+0 records in 00:11:53.528 1+0 records out 00:11:53.528 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247877 s, 16.5 MB/s 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:11:53.528 17:05:48 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:53.528 17:05:48 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:11:53.528 17:05:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:53.528 17:05:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:53.528 17:05:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:11:53.787 /dev/nbd1 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:11:53.787 1+0 records in 00:11:53.787 1+0 records out 00:11:53.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276586 s, 14.8 MB/s 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:53.787 17:05:49 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:53.787 17:05:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:54.046 17:05:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:54.046 { 00:11:54.046 "nbd_device": "/dev/nbd0", 00:11:54.046 "bdev_name": "Malloc0" 00:11:54.046 }, 00:11:54.046 { 00:11:54.046 "nbd_device": "/dev/nbd1", 00:11:54.046 "bdev_name": "Malloc1" 00:11:54.046 } 00:11:54.046 ]' 00:11:54.046 17:05:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:54.046 { 00:11:54.046 "nbd_device": "/dev/nbd0", 00:11:54.046 "bdev_name": "Malloc0" 00:11:54.046 }, 00:11:54.046 { 00:11:54.046 "nbd_device": "/dev/nbd1", 00:11:54.046 "bdev_name": "Malloc1" 00:11:54.046 } 00:11:54.046 ]' 00:11:54.046 17:05:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:11:54.047 /dev/nbd1' 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:11:54.047 /dev/nbd1' 00:11:54.047 
17:05:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:11:54.047 256+0 records in 00:11:54.047 256+0 records out 00:11:54.047 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113124 s, 92.7 MB/s 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.047 17:05:49 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:11:54.306 256+0 records in 00:11:54.306 256+0 records out 00:11:54.306 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0296205 s, 35.4 MB/s 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:11:54.306 256+0 records in 00:11:54.306 256+0 records out 00:11:54.306 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0310946 s, 33.7 MB/s 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:54.306 17:05:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:54.565 17:05:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:54.824 17:05:50 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:54.824 17:05:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:54.824 17:05:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:11:55.083 17:05:50 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:11:55.083 17:05:50 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:11:55.342 17:05:50 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:11:55.602 [2024-07-23 17:05:50.780993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:55.602 [2024-07-23 17:05:50.829909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:55.602 [2024-07-23 17:05:50.829909] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 1 00:11:55.602 [2024-07-23 17:05:50.876036] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:11:55.602 [2024-07-23 17:05:50.876086] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:11:58.892 17:05:53 event.app_repeat -- event/event.sh@38 -- # waitforlisten 4076646 /var/tmp/spdk-nbd.sock 00:11:58.892 17:05:53 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 4076646 ']' 00:11:58.892 17:05:53 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:11:58.892 17:05:53 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:58.892 17:05:53 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:11:58.892 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:11:58.892 17:05:53 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:11:58.893 17:05:53 event.app_repeat -- event/event.sh@39 -- # killprocess 4076646 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 4076646 ']' 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 4076646 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4076646 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4076646' 00:11:58.893 killing process with pid 4076646 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@967 -- # kill 4076646 00:11:58.893 17:05:53 event.app_repeat -- common/autotest_common.sh@972 -- # wait 4076646 00:11:58.893 spdk_app_start is called in Round 0. 00:11:58.893 Shutdown signal received, stop current app iteration 00:11:58.893 Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 reinitialization... 00:11:58.893 spdk_app_start is called in Round 1. 00:11:58.893 Shutdown signal received, stop current app iteration 00:11:58.893 Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 reinitialization... 00:11:58.893 spdk_app_start is called in Round 2. 
00:11:58.893 Shutdown signal received, stop current app iteration 00:11:58.893 Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 reinitialization... 00:11:58.893 spdk_app_start is called in Round 3. 00:11:58.893 Shutdown signal received, stop current app iteration 00:11:58.893 17:05:54 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:11:58.893 17:05:54 event.app_repeat -- event/event.sh@42 -- # return 0 00:11:58.893 00:11:58.893 real 0m18.791s 00:11:58.893 user 0m40.880s 00:11:58.893 sys 0m3.868s 00:11:58.893 17:05:54 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:58.893 17:05:54 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:11:58.893 ************************************ 00:11:58.893 END TEST app_repeat 00:11:58.893 ************************************ 00:11:58.893 17:05:54 event -- common/autotest_common.sh@1142 -- # return 0 00:11:58.893 17:05:54 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:11:58.893 00:11:58.893 real 0m28.414s 00:11:58.893 user 0m57.478s 00:11:58.893 sys 0m5.368s 00:11:58.893 17:05:54 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:58.893 17:05:54 event -- common/autotest_common.sh@10 -- # set +x 00:11:58.893 ************************************ 00:11:58.893 END TEST event 00:11:58.893 ************************************ 00:11:58.893 17:05:54 -- common/autotest_common.sh@1142 -- # return 0 00:11:58.893 17:05:54 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:11:58.893 17:05:54 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:58.893 17:05:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:58.893 17:05:54 -- common/autotest_common.sh@10 -- # set +x 00:11:58.893 ************************************ 00:11:58.893 START TEST thread 00:11:58.893 ************************************ 00:11:58.893 17:05:54 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:11:58.893 * Looking for test storage... 00:11:59.152 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:11:59.152 17:05:54 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:11:59.152 17:05:54 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:11:59.152 17:05:54 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:59.152 17:05:54 thread -- common/autotest_common.sh@10 -- # set +x 00:11:59.152 ************************************ 00:11:59.152 START TEST thread_poller_perf 00:11:59.152 ************************************ 00:11:59.152 17:05:54 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:11:59.152 [2024-07-23 17:05:54.388757] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:11:59.152 [2024-07-23 17:05:54.388833] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4079440 ] 00:11:59.152 [2024-07-23 17:05:54.519725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:59.152 [2024-07-23 17:05:54.572626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:59.152 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:12:00.530 ====================================== 00:12:00.530 busy:2310630334 (cyc) 00:12:00.530 total_run_count: 267000 00:12:00.530 tsc_hz: 2300000000 (cyc) 00:12:00.530 ====================================== 00:12:00.530 poller_cost: 8654 (cyc), 3762 (nsec) 00:12:00.530 00:12:00.530 real 0m1.299s 00:12:00.530 user 0m1.150s 00:12:00.530 sys 0m0.143s 00:12:00.530 17:05:55 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:00.530 17:05:55 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:12:00.530 ************************************ 00:12:00.530 END TEST thread_poller_perf 00:12:00.530 ************************************ 00:12:00.530 17:05:55 thread -- common/autotest_common.sh@1142 -- # return 0 00:12:00.530 17:05:55 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:12:00.530 17:05:55 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:12:00.530 17:05:55 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:00.530 17:05:55 thread -- common/autotest_common.sh@10 -- # set +x 00:12:00.530 ************************************ 00:12:00.530 START TEST thread_poller_perf 00:12:00.530 ************************************ 00:12:00.530 17:05:55 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:12:00.530 [2024-07-23 17:05:55.777311] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:00.530 [2024-07-23 17:05:55.777376] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4079653 ] 00:12:00.530 [2024-07-23 17:05:55.893532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:00.530 [2024-07-23 17:05:55.947028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:00.530 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:12:01.905 ====================================== 00:12:01.905 busy:2302650930 (cyc) 00:12:01.905 total_run_count: 3495000 00:12:01.905 tsc_hz: 2300000000 (cyc) 00:12:01.905 ====================================== 00:12:01.905 poller_cost: 658 (cyc), 286 (nsec) 00:12:01.905 00:12:01.905 real 0m1.279s 00:12:01.905 user 0m1.141s 00:12:01.905 sys 0m0.132s 00:12:01.905 17:05:57 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:01.905 17:05:57 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:12:01.905 ************************************ 00:12:01.905 END TEST thread_poller_perf 00:12:01.905 ************************************ 00:12:01.905 17:05:57 thread -- common/autotest_common.sh@1142 -- # return 0 00:12:01.905 17:05:57 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:12:01.905 00:12:01.905 real 0m2.872s 00:12:01.905 user 0m2.388s 00:12:01.905 sys 0m0.496s 00:12:01.905 17:05:57 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:01.905 17:05:57 thread -- common/autotest_common.sh@10 -- # set +x 00:12:01.905 ************************************ 00:12:01.905 END TEST thread 00:12:01.905 ************************************ 00:12:01.905 17:05:57 -- common/autotest_common.sh@1142 -- # return 0 00:12:01.905 17:05:57 -- spdk/autotest.sh@183 -- # run_test accel 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:12:01.905 17:05:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:12:01.905 17:05:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:01.905 17:05:57 -- common/autotest_common.sh@10 -- # set +x 00:12:01.905 ************************************ 00:12:01.905 START TEST accel 00:12:01.905 ************************************ 00:12:01.906 17:05:57 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:12:01.906 * Looking for test storage... 00:12:01.906 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:12:01.906 17:05:57 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:12:01.906 17:05:57 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:12:01.906 17:05:57 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:12:01.906 17:05:57 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=4079909 00:12:01.906 17:05:57 accel -- accel/accel.sh@63 -- # waitforlisten 4079909 00:12:01.906 17:05:57 accel -- common/autotest_common.sh@829 -- # '[' -z 4079909 ']' 00:12:01.906 17:05:57 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:01.906 17:05:57 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:01.906 17:05:57 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:12:01.906 17:05:57 accel -- accel/accel.sh@61 -- # build_accel_config 00:12:01.906 17:05:57 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:01.906 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:12:01.906 17:05:57 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:01.906 17:05:57 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:01.906 17:05:57 accel -- common/autotest_common.sh@10 -- # set +x 00:12:01.906 17:05:57 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:01.906 17:05:57 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:01.906 17:05:57 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:01.906 17:05:57 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:01.906 17:05:57 accel -- accel/accel.sh@40 -- # local IFS=, 00:12:01.906 17:05:57 accel -- accel/accel.sh@41 -- # jq -r . 00:12:02.165 [2024-07-23 17:05:57.341152] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:02.165 [2024-07-23 17:05:57.341223] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4079909 ] 00:12:02.165 [2024-07-23 17:05:57.474113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:02.165 [2024-07-23 17:05:57.526045] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:03.103 17:05:58 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:03.103 17:05:58 accel -- common/autotest_common.sh@862 -- # return 0 00:12:03.103 17:05:58 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:12:03.103 17:05:58 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:12:03.103 17:05:58 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:12:03.103 17:05:58 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:12:03.103 17:05:58 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:12:03.103 17:05:58 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:12:03.103 17:05:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:03.103 17:05:58 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:12:03.103 17:05:58 accel -- common/autotest_common.sh@10 -- # set +x 00:12:03.103 17:05:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # 
IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.103 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.103 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.103 17:05:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:03.104 17:05:58 accel -- accel/accel.sh@72 -- # IFS== 00:12:03.104 17:05:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:03.104 17:05:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:03.104 17:05:58 accel -- accel/accel.sh@75 -- # killprocess 4079909 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@948 -- # '[' -z 4079909 ']' 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@952 -- # kill -0 4079909 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@953 -- # uname 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4079909 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4079909' 00:12:03.104 killing process with pid 4079909 00:12:03.104 17:05:58 accel -- common/autotest_common.sh@967 -- # kill 4079909 00:12:03.104 
17:05:58 accel -- common/autotest_common.sh@972 -- # wait 4079909 00:12:03.363 17:05:58 accel -- accel/accel.sh@76 -- # trap - ERR 00:12:03.363 17:05:58 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:12:03.363 17:05:58 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:03.363 17:05:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:03.363 17:05:58 accel -- common/autotest_common.sh@10 -- # set +x 00:12:03.363 17:05:58 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:12:03.363 17:05:58 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:12:03.622 17:05:58 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:03.622 17:05:58 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:12:03.622 17:05:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:03.622 17:05:58 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:12:03.622 17:05:58 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:12:03.622 17:05:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:03.622 17:05:58 accel -- common/autotest_common.sh@10 -- # set +x 00:12:03.622 ************************************ 00:12:03.622 START TEST accel_missing_filename 00:12:03.622 ************************************ 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:12:03.622 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:03.623 17:05:58 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:12:03.623 17:05:58 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:12:03.623 17:05:58 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:12:03.623 [2024-07-23 17:05:58.950213] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:03.623 [2024-07-23 17:05:58.950341] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4080129 ] 00:12:03.882 [2024-07-23 17:05:59.150353] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:03.882 [2024-07-23 17:05:59.208290] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:03.882 [2024-07-23 17:05:59.277633] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:12:04.142 [2024-07-23 17:05:59.351702] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:12:04.142 A filename is required. 
00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:04.142 00:12:04.142 real 0m0.532s 00:12:04.142 user 0m0.287s 00:12:04.142 sys 0m0.275s 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:04.142 17:05:59 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:12:04.142 ************************************ 00:12:04.142 END TEST accel_missing_filename 00:12:04.142 ************************************ 00:12:04.142 17:05:59 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:04.142 17:05:59 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:04.142 17:05:59 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:12:04.142 17:05:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:04.142 17:05:59 accel -- common/autotest_common.sh@10 -- # set +x 00:12:04.142 ************************************ 00:12:04.142 START TEST accel_compress_verify 00:12:04.142 ************************************ 00:12:04.142 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:04.142 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:12:04.142 17:05:59 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:04.142 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:12:04.143 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:04.143 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:12:04.143 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:04.143 17:05:59 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:12:04.143 17:05:59 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:12:04.402 [2024-07-23 17:05:59.564712] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:04.402 [2024-07-23 17:05:59.564844] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4080163 ] 00:12:04.402 [2024-07-23 17:05:59.765490] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:04.402 [2024-07-23 17:05:59.821149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:04.662 [2024-07-23 17:05:59.883985] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:12:04.662 [2024-07-23 17:05:59.947203] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:12:04.662 00:12:04.662 Compression does not support the verify option, aborting. 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:04.662 00:12:04.662 real 0m0.509s 00:12:04.662 user 0m0.277s 00:12:04.662 sys 0m0.257s 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:04.662 17:06:00 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:12:04.662 ************************************ 00:12:04.662 END TEST accel_compress_verify 00:12:04.662 ************************************ 00:12:04.662 17:06:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:04.662 17:06:00 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:12:04.662 17:06:00 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:12:04.662 17:06:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:04.662 17:06:00 accel -- common/autotest_common.sh@10 -- # set +x 00:12:04.921 ************************************ 00:12:04.922 START TEST accel_wrong_workload 00:12:04.922 ************************************ 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:04.922 17:06:00 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:12:04.922 17:06:00 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:12:04.922 Unsupported workload type: foobar 00:12:04.922 [2024-07-23 17:06:00.145110] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:12:04.922 accel_perf options: 00:12:04.922 [-h help message] 00:12:04.922 [-q queue depth per core] 00:12:04.922 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:12:04.922 [-T number of threads per core 00:12:04.922 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:12:04.922 [-t time in seconds] 00:12:04.922 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:12:04.922 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:12:04.922 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:12:04.922 [-l for compress/decompress workloads, name of uncompressed input file 00:12:04.922 [-S for crc32c workload, use this seed value (default 0) 00:12:04.922 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:12:04.922 [-f for fill workload, use this BYTE value (default 255) 00:12:04.922 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:12:04.922 [-y verify result if this switch is on] 00:12:04.922 [-a tasks to allocate per core (default: same value as -q)] 00:12:04.922 Can be used to spread operations across a wider range of memory. 
00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:04.922 00:12:04.922 real 0m0.044s 00:12:04.922 user 0m0.025s 00:12:04.922 sys 0m0.019s 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:04.922 17:06:00 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:12:04.922 ************************************ 00:12:04.922 END TEST accel_wrong_workload 00:12:04.922 ************************************ 00:12:04.922 Error: writing output failed: Broken pipe 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:04.922 17:06:00 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@10 -- # set +x 00:12:04.922 ************************************ 00:12:04.922 START TEST accel_negative_buffers 00:12:04.922 ************************************ 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:12:04.922 17:06:00 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:12:04.922 17:06:00 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:12:04.922 -x option must be non-negative. 00:12:04.922 [2024-07-23 17:06:00.266542] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:12:04.922 accel_perf options: 00:12:04.922 [-h help message] 00:12:04.922 [-q queue depth per core] 00:12:04.922 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:12:04.922 [-T number of threads per core 00:12:04.922 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:12:04.922 [-t time in seconds] 00:12:04.922 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:12:04.922 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:12:04.922 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:12:04.922 [-l for compress/decompress workloads, name of uncompressed input file 00:12:04.922 [-S for crc32c workload, use this seed value (default 0) 00:12:04.922 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:12:04.922 [-f for fill workload, use this BYTE value (default 255) 00:12:04.922 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:12:04.922 [-y verify result if this switch is on] 00:12:04.922 [-a tasks to allocate per core (default: same value as -q)] 00:12:04.922 Can be used to spread operations across a wider range of memory. 
00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:04.922 00:12:04.922 real 0m0.044s 00:12:04.922 user 0m0.024s 00:12:04.922 sys 0m0.020s 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:04.922 17:06:00 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:12:04.922 ************************************ 00:12:04.922 END TEST accel_negative_buffers 00:12:04.922 ************************************ 00:12:04.922 Error: writing output failed: Broken pipe 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:04.922 17:06:00 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:04.922 17:06:00 accel -- common/autotest_common.sh@10 -- # set +x 00:12:05.182 ************************************ 00:12:05.182 START TEST accel_crc32c 00:12:05.182 ************************************ 00:12:05.182 17:06:00 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 
00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:12:05.182 17:06:00 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:12:05.182 [2024-07-23 17:06:00.391704] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:05.182 [2024-07-23 17:06:00.391770] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4080409 ] 00:12:05.182 [2024-07-23 17:06:00.524257] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.182 [2024-07-23 17:06:00.582941] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.441 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:05.442 17:06:00 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:06.819 17:06:01 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.819 17:06:01 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:01 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:01 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:01 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:06.820 17:06:01 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:12:06.820 17:06:01 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:06.820 00:12:06.820 real 0m1.463s 00:12:06.820 user 0m1.253s 00:12:06.820 sys 0m0.208s 00:12:06.820 17:06:01 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:06.820 17:06:01 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:12:06.820 ************************************ 00:12:06.820 END TEST accel_crc32c 00:12:06.820 ************************************ 00:12:06.820 17:06:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:06.820 17:06:01 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:12:06.820 17:06:01 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:12:06.820 17:06:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:06.820 17:06:01 accel -- common/autotest_common.sh@10 -- # set +x 00:12:06.820 ************************************ 
00:12:06.820 START TEST accel_crc32c_C2 00:12:06.820 ************************************ 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:12:06.820 17:06:01 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:12:06.820 [2024-07-23 17:06:01.936878] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:06.820 [2024-07-23 17:06:01.936953] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4080687 ] 00:12:06.820 [2024-07-23 17:06:02.050279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:06.820 [2024-07-23 17:06:02.107066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:06.820 
17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:06.820 17:06:02 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:08.197 00:12:08.197 real 0m1.434s 00:12:08.197 user 0m1.265s 00:12:08.197 sys 0m0.174s 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:08.197 17:06:03 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:12:08.197 ************************************ 00:12:08.197 END TEST accel_crc32c_C2 00:12:08.197 ************************************ 00:12:08.197 17:06:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:08.197 17:06:03 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:12:08.197 17:06:03 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:12:08.197 17:06:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:08.197 17:06:03 accel -- common/autotest_common.sh@10 -- # set +x 00:12:08.197 ************************************ 00:12:08.197 START TEST accel_copy 00:12:08.197 ************************************ 00:12:08.197 17:06:03 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:12:08.197 17:06:03 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:12:08.197 [2024-07-23 17:06:03.449403] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:08.197 [2024-07-23 17:06:03.449467] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4080901 ] 00:12:08.197 [2024-07-23 17:06:03.580878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:08.457 [2024-07-23 17:06:03.635382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:12:08.457 17:06:03 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:08.457 17:06:03 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:12:09.877 17:06:04 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:09.877 00:12:09.877 real 0m1.441s 00:12:09.877 user 0m1.248s 00:12:09.877 sys 0m0.195s 00:12:09.877 17:06:04 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:09.877 17:06:04 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:12:09.877 ************************************ 00:12:09.877 END TEST accel_copy 00:12:09.877 ************************************ 00:12:09.877 17:06:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:09.877 17:06:04 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:12:09.877 17:06:04 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:12:09.877 17:06:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:09.877 17:06:04 accel -- common/autotest_common.sh@10 -- # set +x 00:12:09.877 ************************************ 00:12:09.877 START TEST accel_fill 00:12:09.877 ************************************ 00:12:09.877 17:06:04 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:12:09.877 17:06:04 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:12:09.877 [2024-07-23 17:06:04.974783] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:09.877 [2024-07-23 17:06:04.974845] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4081288 ] 00:12:09.877 [2024-07-23 17:06:05.105630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:09.877 [2024-07-23 17:06:05.160121] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.877 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:09.878 17:06:05 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:09.878 17:06:05 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:09.878 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:09.878 17:06:05 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:11.253 17:06:06 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:12:11.253 17:06:06 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:11.253 00:12:11.253 real 0m1.441s 00:12:11.253 user 0m1.251s 00:12:11.253 sys 0m0.195s 00:12:11.253 17:06:06 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:11.253 17:06:06 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:12:11.253 ************************************ 00:12:11.253 END TEST accel_fill 00:12:11.253 ************************************ 00:12:11.253 17:06:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:11.253 17:06:06 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:12:11.253 17:06:06 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:12:11.253 17:06:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:11.253 17:06:06 accel -- common/autotest_common.sh@10 -- # set +x 00:12:11.253 ************************************ 00:12:11.253 START TEST accel_copy_crc32c 00:12:11.253 ************************************ 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:12:11.253 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:12:11.253 [2024-07-23 17:06:06.508201] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:11.253 [2024-07-23 17:06:06.508329] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4081785 ] 00:12:11.512 [2024-07-23 17:06:06.707521] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:11.512 [2024-07-23 17:06:06.761942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:12:11.512 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.513 17:06:06 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:11.513 17:06:06 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:12.888 17:06:07 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:12:12.888 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:12.889 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:12:12.889 17:06:07 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:12.889 00:12:12.889 real 0m1.514s 00:12:12.889 user 0m1.278s 00:12:12.889 sys 0m0.239s 00:12:12.889 17:06:07 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:12.889 17:06:07 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:12:12.889 ************************************ 00:12:12.889 END TEST accel_copy_crc32c 00:12:12.889 ************************************ 00:12:12.889 17:06:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:12.889 17:06:08 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:12:12.889 17:06:08 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:12:12.889 17:06:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:12.889 17:06:08 accel -- common/autotest_common.sh@10 -- # set +x 00:12:12.889 ************************************ 00:12:12.889 START TEST accel_copy_crc32c_C2 00:12:12.889 
************************************ 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:12:12.889 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:12:12.889 [2024-07-23 17:06:08.092291] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:12.889 [2024-07-23 17:06:08.092351] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4082050 ] 00:12:12.889 [2024-07-23 17:06:08.222498] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:12.889 [2024-07-23 17:06:08.275809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:13.148 17:06:08 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 
17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:14.525 00:12:14.525 real 0m1.457s 00:12:14.525 user 0m1.256s 00:12:14.525 sys 0m0.202s 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:14.525 17:06:09 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:12:14.525 ************************************ 00:12:14.525 END TEST accel_copy_crc32c_C2 00:12:14.525 ************************************ 00:12:14.525 17:06:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:14.525 17:06:09 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:12:14.525 17:06:09 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:12:14.525 17:06:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:14.525 17:06:09 accel -- common/autotest_common.sh@10 -- # set +x 00:12:14.525 ************************************ 00:12:14.525 START TEST accel_dualcast 00:12:14.525 ************************************ 00:12:14.525 17:06:09 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:12:14.525 17:06:09 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:12:14.525 [2024-07-23 17:06:09.645760] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:14.525 [2024-07-23 17:06:09.645889] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4082243 ] 00:12:14.525 [2024-07-23 17:06:09.846367] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:14.525 [2024-07-23 17:06:09.904279] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.784 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:14.785 17:06:09 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:12:15.720 17:06:11 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:15.720 00:12:15.720 real 0m1.535s 00:12:15.720 user 0m1.268s 00:12:15.720 sys 0m0.267s 00:12:15.720 17:06:11 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:15.720 17:06:11 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:12:15.720 ************************************ 00:12:15.720 END TEST accel_dualcast 00:12:15.720 ************************************ 00:12:15.979 17:06:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:15.979 17:06:11 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:12:15.979 17:06:11 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:12:15.979 17:06:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:15.979 17:06:11 accel -- common/autotest_common.sh@10 -- # set +x 00:12:15.979 ************************************ 00:12:15.979 START TEST accel_compare 00:12:15.979 ************************************ 00:12:15.979 17:06:11 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:15.979 
17:06:11 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:12:15.979 17:06:11 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:12:15.979 [2024-07-23 17:06:11.260884] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:15.979 [2024-07-23 17:06:11.261022] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4082443 ] 00:12:16.239 [2024-07-23 17:06:11.461603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:16.239 [2024-07-23 17:06:11.520517] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 
17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:16.239 17:06:11 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:16.239 17:06:11 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:12:17.618 17:06:12 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:17.618 00:12:17.618 real 0m1.528s 00:12:17.618 user 0m1.281s 00:12:17.618 sys 0m0.252s 00:12:17.618 17:06:12 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:17.618 17:06:12 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:12:17.618 ************************************ 00:12:17.618 END TEST accel_compare 00:12:17.618 ************************************ 00:12:17.618 17:06:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:17.618 17:06:12 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:12:17.618 17:06:12 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:12:17.618 17:06:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:17.618 17:06:12 accel -- common/autotest_common.sh@10 -- # set +x 00:12:17.618 ************************************ 00:12:17.618 START TEST accel_xor 00:12:17.618 ************************************ 00:12:17.618 17:06:12 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:12:17.618 17:06:12 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:12:17.618 [2024-07-23 17:06:12.856205] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:17.618 [2024-07-23 17:06:12.856268] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4082646 ] 00:12:17.618 [2024-07-23 17:06:12.975573] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:17.618 [2024-07-23 17:06:13.025131] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:17.877 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.877 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.877 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:17.878 17:06:13 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.258 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:19.259 00:12:19.259 real 0m1.435s 00:12:19.259 user 0m1.245s 00:12:19.259 sys 0m0.188s 00:12:19.259 17:06:14 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:19.259 17:06:14 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:12:19.259 ************************************ 00:12:19.259 END TEST accel_xor 00:12:19.259 ************************************ 00:12:19.259 17:06:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:19.259 17:06:14 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:12:19.259 17:06:14 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:12:19.259 17:06:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:19.259 17:06:14 accel -- common/autotest_common.sh@10 -- # set +x 00:12:19.259 ************************************ 00:12:19.259 START TEST accel_xor 00:12:19.259 ************************************ 00:12:19.259 17:06:14 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:12:19.259 [2024-07-23 17:06:14.374910] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:19.259 [2024-07-23 17:06:14.374978] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4082858 ] 00:12:19.259 [2024-07-23 17:06:14.504921] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:19.259 [2024-07-23 17:06:14.557285] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 
17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:19.259 17:06:14 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.637 17:06:15 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:12:20.638 17:06:15 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:12:20.638 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:12:20.638 17:06:15 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:12:20.638 17:06:15 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:20.638 17:06:15 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:12:20.638 17:06:15 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:20.638 00:12:20.638 real 0m1.449s 00:12:20.638 user 0m1.258s 00:12:20.638 sys 0m0.192s 00:12:20.638 17:06:15 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:12:20.638 17:06:15 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:12:20.638 ************************************ 00:12:20.638 END TEST accel_xor 00:12:20.638 ************************************ 00:12:20.638 17:06:15 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:20.638 17:06:15 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:12:20.638 17:06:15 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:12:20.638 17:06:15 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:20.638 17:06:15 accel -- common/autotest_common.sh@10 -- # set +x 00:12:20.638 ************************************ 00:12:20.638 START TEST accel_dif_verify 00:12:20.638 ************************************ 00:12:20.638 17:06:15 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:12:20.638 17:06:15 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:12:20.638 [2024-07-23 17:06:15.898839] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:20.638 [2024-07-23 17:06:15.898908] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4083144 ] 00:12:20.638 [2024-07-23 17:06:16.027331] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:20.897 [2024-07-23 17:06:16.080803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.897 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:20.898 17:06:16 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:12:22.276 17:06:17 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:22.276 00:12:22.276 real 0m1.444s 00:12:22.276 user 0m1.257s 00:12:22.276 sys 0m0.191s 00:12:22.276 17:06:17 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:22.276 17:06:17 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:12:22.276 ************************************ 00:12:22.276 END TEST accel_dif_verify 00:12:22.276 
************************************ 00:12:22.276 17:06:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:22.276 17:06:17 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:12:22.276 17:06:17 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:12:22.276 17:06:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:22.276 17:06:17 accel -- common/autotest_common.sh@10 -- # set +x 00:12:22.276 ************************************ 00:12:22.276 START TEST accel_dif_generate 00:12:22.276 ************************************ 00:12:22.276 17:06:17 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:12:22.276 17:06:17 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:12:22.276 17:06:17 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:12:22.276 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:12:22.277 17:06:17 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:12:22.277 [2024-07-23 17:06:17.428127] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:22.277 [2024-07-23 17:06:17.428188] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4083392 ] 00:12:22.277 [2024-07-23 17:06:17.558948] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:22.277 [2024-07-23 17:06:17.612052] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:12:22.277 17:06:17 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 
17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:22.277 17:06:17 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:12:23.655 17:06:18 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:23.655 00:12:23.655 real 0m1.450s 00:12:23.655 user 0m1.251s 00:12:23.655 sys 0m0.205s 00:12:23.655 17:06:18 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:23.655 17:06:18 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:12:23.655 ************************************ 00:12:23.655 END TEST 
accel_dif_generate 00:12:23.655 ************************************ 00:12:23.655 17:06:18 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:23.655 17:06:18 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:12:23.655 17:06:18 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:12:23.655 17:06:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:23.655 17:06:18 accel -- common/autotest_common.sh@10 -- # set +x 00:12:23.655 ************************************ 00:12:23.655 START TEST accel_dif_generate_copy 00:12:23.655 ************************************ 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:12:23.655 17:06:18 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:12:23.655 [2024-07-23 17:06:18.961289] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:23.655 [2024-07-23 17:06:18.961350] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4083591 ] 00:12:23.914 [2024-07-23 17:06:19.091863] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:23.914 [2024-07-23 17:06:19.145487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.914 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:23.915 17:06:19 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:25.292 17:06:20 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:25.292 00:12:25.292 real 0m1.447s 00:12:25.292 user 0m1.253s 00:12:25.292 sys 0m0.200s 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:25.292 17:06:20 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:12:25.292 ************************************ 00:12:25.292 END TEST 
accel_dif_generate_copy 00:12:25.292 ************************************ 00:12:25.292 17:06:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:25.292 17:06:20 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:12:25.292 17:06:20 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:25.292 17:06:20 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:12:25.292 17:06:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:25.292 17:06:20 accel -- common/autotest_common.sh@10 -- # set +x 00:12:25.292 ************************************ 00:12:25.292 START TEST accel_comp 00:12:25.292 ************************************ 00:12:25.292 17:06:20 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:25.292 17:06:20 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:12:25.292 17:06:20 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:12:25.292 [2024-07-23 17:06:20.495614] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:25.292 [2024-07-23 17:06:20.495679] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4083784 ] 00:12:25.292 [2024-07-23 17:06:20.610257] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:25.292 [2024-07-23 17:06:20.666436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:25.551 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.551 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.551 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.551 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.551 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:25.552 17:06:20 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:26.487 17:06:21 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:26.487 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:12:26.747 17:06:21 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:12:26.747 17:06:21 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:26.747 17:06:21 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:12:26.747 17:06:21 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:26.747 00:12:26.747 real 0m1.449s 00:12:26.747 user 0m1.257s 00:12:26.747 sys 0m0.188s 00:12:26.747 17:06:21 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:26.747 17:06:21 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:12:26.747 ************************************ 00:12:26.747 END TEST accel_comp 00:12:26.747 ************************************ 00:12:26.747 17:06:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:26.747 17:06:21 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:26.747 17:06:21 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:12:26.747 17:06:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:26.747 17:06:21 accel -- common/autotest_common.sh@10 -- # set +x 00:12:26.747 ************************************ 00:12:26.747 START TEST accel_decomp 00:12:26.747 ************************************ 00:12:26.747 17:06:21 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:26.747 17:06:21 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:12:26.747 17:06:21 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:12:26.747 17:06:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:26.747 17:06:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:26.747 17:06:21 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:26.747 17:06:21 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:26.747 17:06:21 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:12:26.747 17:06:22 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:12:26.747 [2024-07-23 17:06:22.029807] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:26.747 [2024-07-23 17:06:22.029877] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4083984 ] 00:12:26.747 [2024-07-23 17:06:22.160881] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:27.007 [2024-07-23 17:06:22.215909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 
17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:27.007 17:06:22 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:27.007 17:06:22 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:28.386 17:06:23 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:28.386 00:12:28.386 real 0m1.450s 00:12:28.386 user 0m1.252s 00:12:28.386 sys 0m0.202s 00:12:28.386 17:06:23 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:28.386 17:06:23 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:12:28.386 ************************************ 00:12:28.386 END TEST accel_decomp 00:12:28.386 ************************************ 00:12:28.386 17:06:23 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:28.386 17:06:23 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:28.386 17:06:23 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:12:28.386 17:06:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:28.386 17:06:23 accel -- common/autotest_common.sh@10 -- # set +x 00:12:28.386 ************************************ 00:12:28.386 START TEST accel_decomp_full 00:12:28.386 ************************************ 00:12:28.386 17:06:23 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:12:28.386 
17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:12:28.386 [2024-07-23 17:06:23.554266] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:28.386 [2024-07-23 17:06:23.554328] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4084181 ] 00:12:28.386 [2024-07-23 17:06:23.682365] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:28.386 [2024-07-23 17:06:23.732334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.386 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.387 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.387 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.387 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.387 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.387 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:28.646 17:06:23 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.628 17:06:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:29.628 17:06:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:29.628 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:29.628 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.628 17:06:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:29.628 17:06:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:29.629 17:06:24 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:29.629 00:12:29.629 real 0m1.439s 00:12:29.629 user 0m1.245s 00:12:29.629 sys 0m0.196s 00:12:29.629 17:06:24 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:29.629 17:06:24 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:12:29.629 ************************************ 00:12:29.629 END TEST accel_decomp_full 00:12:29.629 ************************************ 00:12:29.629 17:06:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:29.629 17:06:25 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:29.629 17:06:25 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:12:29.629 17:06:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:29.629 17:06:25 accel -- common/autotest_common.sh@10 -- # set +x 00:12:29.889 
************************************ 00:12:29.889 START TEST accel_decomp_mcore 00:12:29.889 ************************************ 00:12:29.889 17:06:25 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:29.889 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:12:29.889 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:12:29.889 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:12:29.890 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:12:29.890 [2024-07-23 17:06:25.078912] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:29.890 [2024-07-23 17:06:25.078981] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4084383 ] 00:12:29.890 [2024-07-23 17:06:25.199471] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:12:29.890 [2024-07-23 17:06:25.256973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:29.890 [2024-07-23 17:06:25.257073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:12:29.890 [2024-07-23 17:06:25.257173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:29.890 [2024-07-23 17:06:25.257173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.149 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:30.150 17:06:25 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:31.088 00:12:31.088 real 0m1.449s 00:12:31.088 user 0m4.685s 00:12:31.088 sys 0m0.201s 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:31.088 17:06:26 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:12:31.088 ************************************ 00:12:31.088 END TEST accel_decomp_mcore 00:12:31.088 ************************************ 00:12:31.348 17:06:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:31.348 17:06:26 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:31.348 17:06:26 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:12:31.348 17:06:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:31.348 17:06:26 accel -- common/autotest_common.sh@10 -- # set +x 00:12:31.348 ************************************ 00:12:31.348 START TEST accel_decomp_full_mcore 00:12:31.348 ************************************ 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:12:31.348 17:06:26 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:12:31.348 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:12:31.348 [2024-07-23 17:06:26.609443] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:31.348 [2024-07-23 17:06:26.609504] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4084600 ] 00:12:31.348 [2024-07-23 17:06:26.740409] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:12:31.608 [2024-07-23 17:06:26.797933] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:31.608 [2024-07-23 17:06:26.797995] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:12:31.608 [2024-07-23 17:06:26.798095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:31.608 [2024-07-23 17:06:26.798094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:12:31.608 17:06:26 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.608 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:31.609 17:06:26 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:32.987 00:12:32.987 real 0m1.498s 00:12:32.987 user 0m4.823s 00:12:32.987 sys 0m0.218s 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:32.987 17:06:28 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:12:32.987 ************************************ 00:12:32.987 END TEST accel_decomp_full_mcore 00:12:32.987 ************************************ 00:12:32.987 17:06:28 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:32.987 17:06:28 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:32.987 17:06:28 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:12:32.987 17:06:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:32.987 17:06:28 accel -- common/autotest_common.sh@10 -- # set +x 00:12:32.987 
************************************ 00:12:32.987 START TEST accel_decomp_mthread 00:12:32.987 ************************************ 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:12:32.987 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:12:32.987 [2024-07-23 17:06:28.190511] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:32.987 [2024-07-23 17:06:28.190576] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4084904 ] 00:12:32.987 [2024-07-23 17:06:28.321523] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:32.987 [2024-07-23 17:06:28.374843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.246 17:06:28 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.246 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 
17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:33.247 17:06:28 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.184 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:34.444 00:12:34.444 real 0m1.458s 00:12:34.444 user 0m1.260s 00:12:34.444 sys 0m0.203s 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:34.444 17:06:29 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:12:34.444 ************************************ 00:12:34.444 END TEST accel_decomp_mthread 00:12:34.444 ************************************ 00:12:34.444 17:06:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:34.444 17:06:29 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:34.444 17:06:29 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:12:34.444 17:06:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:34.444 17:06:29 accel -- common/autotest_common.sh@10 -- # set +x 00:12:34.444 ************************************ 00:12:34.444 START TEST accel_decomp_full_mthread 00:12:34.444 ************************************ 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:12:34.444 17:06:29 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:12:34.444 [2024-07-23 17:06:29.741362] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:34.444 [2024-07-23 17:06:29.741489] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4085134 ] 00:12:34.704 [2024-07-23 17:06:29.939933] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:34.704 [2024-07-23 17:06:29.997761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.704 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:12:34.705 17:06:30 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:12:36.081 00:12:36.081 real 0m1.562s 00:12:36.081 user 0m1.313s 00:12:36.081 sys 0m0.254s 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:36.081 17:06:31 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:12:36.081 ************************************ 00:12:36.081 END TEST accel_decomp_full_mthread 00:12:36.081 ************************************ 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:36.081 17:06:31 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:12:36.081 17:06:31 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:12:36.081 17:06:31 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:12:36.081 17:06:31 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:12:36.081 17:06:31 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=4085332 00:12:36.081 17:06:31 accel -- accel/accel.sh@63 -- # waitforlisten 4085332 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@829 -- 
# '[' -z 4085332 ']' 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:36.081 17:06:31 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:36.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:36.081 17:06:31 accel -- accel/accel.sh@61 -- # build_accel_config 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:36.081 17:06:31 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:36.081 17:06:31 accel -- common/autotest_common.sh@10 -- # set +x 00:12:36.081 17:06:31 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:36.081 17:06:31 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:36.081 17:06:31 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:36.081 17:06:31 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:36.081 17:06:31 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:36.081 17:06:31 accel -- accel/accel.sh@40 -- # local IFS=, 00:12:36.081 17:06:31 accel -- accel/accel.sh@41 -- # jq -r . 00:12:36.081 [2024-07-23 17:06:31.368441] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:36.081 [2024-07-23 17:06:31.368510] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4085332 ] 00:12:36.081 [2024-07-23 17:06:31.499208] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:36.340 [2024-07-23 17:06:31.550868] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:36.908 [2024-07-23 17:06:32.211116] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@862 -- # return 0 00:12:37.167 17:06:32 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:12:37.167 17:06:32 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:12:37.167 17:06:32 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:12:37.167 17:06:32 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:12:37.167 17:06:32 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:12:37.167 17:06:32 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:12:37.167 17:06:32 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@10 -- # set +x 00:12:37.167 17:06:32 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:37.167 "method": "compressdev_scan_accel_module", 00:12:37.167 17:06:32 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:12:37.167 17:06:32 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@10 -- # set +x 00:12:37.167 17:06:32 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:12:37.167 17:06:32 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- 
accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.426 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.426 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.426 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.427 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.427 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.427 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.427 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.427 17:06:32 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:12:37.427 17:06:32 accel -- accel/accel.sh@72 -- # IFS== 00:12:37.427 17:06:32 accel -- accel/accel.sh@72 -- # read -r opc module 00:12:37.427 17:06:32 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:12:37.427 17:06:32 accel -- accel/accel.sh@75 -- # killprocess 4085332 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@948 -- # '[' -z 4085332 ']' 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@952 -- # kill -0 4085332 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@953 -- # uname 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4085332 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4085332' 00:12:37.427 killing process with pid 4085332 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@967 -- # 
kill 4085332 00:12:37.427 17:06:32 accel -- common/autotest_common.sh@972 -- # wait 4085332 00:12:37.686 17:06:33 accel -- accel/accel.sh@76 -- # trap - ERR 00:12:37.686 17:06:33 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:37.686 17:06:33 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:12:37.686 17:06:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:37.686 17:06:33 accel -- common/autotest_common.sh@10 -- # set +x 00:12:37.686 ************************************ 00:12:37.686 START TEST accel_cdev_comp 00:12:37.686 ************************************ 00:12:37.686 17:06:33 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:37.686 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:12:37.686 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:37.687 17:06:33 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:12:37.687 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:12:37.687 [2024-07-23 17:06:33.083014] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:37.687 [2024-07-23 17:06:33.083083] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4085530 ] 00:12:37.946 [2024-07-23 17:06:33.214450] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:37.946 [2024-07-23 17:06:33.271978] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:38.877 [2024-07-23 17:06:33.939419] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:38.877 [2024-07-23 17:06:33.942079] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15c9480 PMD being used: compress_qat 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.877 [2024-07-23 17:06:33.946155] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15cb3b0 PMD being used: compress_qat 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 
17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:38.877 17:06:33 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.877 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:38.878 17:06:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:12:39.811 17:06:35 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:39.811 00:12:39.811 real 0m2.064s 00:12:39.811 user 0m1.471s 00:12:39.811 sys 0m0.592s 00:12:39.811 17:06:35 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:39.811 17:06:35 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:12:39.811 ************************************ 00:12:39.811 END TEST accel_cdev_comp 00:12:39.811 ************************************ 00:12:39.811 17:06:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:39.811 17:06:35 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:39.811 17:06:35 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:12:39.811 17:06:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:39.811 17:06:35 accel -- common/autotest_common.sh@10 -- # set +x 00:12:39.811 ************************************ 00:12:39.811 START TEST accel_cdev_decomp 00:12:39.811 ************************************ 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:12:39.811 17:06:35 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:12:39.811 17:06:35 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:12:39.811 [2024-07-23 17:06:35.223329] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:39.811 [2024-07-23 17:06:35.223391] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4085887 ] 00:12:40.069 [2024-07-23 17:06:35.352141] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:40.069 [2024-07-23 17:06:35.402737] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:41.005 [2024-07-23 17:06:36.059632] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:41.005 [2024-07-23 17:06:36.062252] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8d8480 PMD being used: compress_qat 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 [2024-07-23 17:06:36.066402] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8da3b0 PMD being used: compress_qat 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.005 17:06:36 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.005 17:06:36 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:41.941 00:12:41.941 real 0m2.038s 00:12:41.941 user 0m1.463s 00:12:41.941 sys 0m0.577s 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:41.941 17:06:37 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:12:41.941 ************************************ 00:12:41.941 END TEST accel_cdev_decomp 00:12:41.941 ************************************ 00:12:41.941 17:06:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:41.941 17:06:37 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:41.941 17:06:37 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:12:41.941 17:06:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:41.941 17:06:37 accel -- common/autotest_common.sh@10 -- # set +x 00:12:41.941 ************************************ 00:12:41.941 START TEST accel_cdev_decomp_full 00:12:41.941 ************************************ 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:12:41.941 17:06:37 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:12:41.941 [2024-07-23 17:06:37.348754] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:41.941 [2024-07-23 17:06:37.348816] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4086100 ] 00:12:42.201 [2024-07-23 17:06:37.480116] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:42.201 [2024-07-23 17:06:37.533793] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:43.138 [2024-07-23 17:06:38.196080] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:43.138 [2024-07-23 17:06:38.198643] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x227a480 PMD being used: compress_qat 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 [2024-07-23 17:06:38.201713] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x227a520 PMD being used: compress_qat 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:43.138 17:06:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:44.075 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:44.076 00:12:44.076 real 0m2.053s 00:12:44.076 user 0m1.479s 00:12:44.076 sys 0m0.568s 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:44.076 17:06:39 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:12:44.076 ************************************ 00:12:44.076 END TEST accel_cdev_decomp_full 00:12:44.076 ************************************ 00:12:44.076 17:06:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:44.076 17:06:39 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:44.076 17:06:39 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:12:44.076 17:06:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:44.076 17:06:39 accel -- common/autotest_common.sh@10 -- # set +x 00:12:44.076 ************************************ 00:12:44.076 START TEST accel_cdev_decomp_mcore 00:12:44.076 ************************************ 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:12:44.076 17:06:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:12:44.076 [2024-07-23 17:06:39.484382] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:44.076 [2024-07-23 17:06:39.484444] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4086463 ] 00:12:44.335 [2024-07-23 17:06:39.613444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:12:44.335 [2024-07-23 17:06:39.668793] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:44.335 [2024-07-23 17:06:39.668917] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:12:44.335 [2024-07-23 17:06:39.668986] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:44.335 [2024-07-23 17:06:39.668985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:12:45.275 [2024-07-23 17:06:40.337168] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:45.275 [2024-07-23 17:06:40.339801] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14b5a70 PMD being used: compress_qat 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 
17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:12:45.275 [2024-07-23 17:06:40.345560] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb1bc19b8b0 PMD being used: compress_qat 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 [2024-07-23 17:06:40.347042] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14b7eb0 PMD being used: compress_qat 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 [2024-07-23 17:06:40.350142] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb1b419b8b0 PMD being used: compress_qat 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 [2024-07-23 17:06:40.350461] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb1ac19b8b0 PMD being used: compress_qat 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:12:45.275 17:06:40 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:45.275 17:06:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 
17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:46.213 00:12:46.213 real 0m2.084s 00:12:46.213 user 0m6.785s 00:12:46.213 
sys 0m0.605s 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:46.213 17:06:41 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:12:46.214 ************************************ 00:12:46.214 END TEST accel_cdev_decomp_mcore 00:12:46.214 ************************************ 00:12:46.214 17:06:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:46.214 17:06:41 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:46.214 17:06:41 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:12:46.214 17:06:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:46.214 17:06:41 accel -- common/autotest_common.sh@10 -- # set +x 00:12:46.214 ************************************ 00:12:46.214 START TEST accel_cdev_decomp_full_mcore 00:12:46.214 ************************************ 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:12:46.214 17:06:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:12:46.473 [2024-07-23 17:06:41.650215] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:46.473 [2024-07-23 17:06:41.650279] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4086712 ] 00:12:46.473 [2024-07-23 17:06:41.779916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:12:46.473 [2024-07-23 17:06:41.837910] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:46.473 [2024-07-23 17:06:41.837984] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:12:46.473 [2024-07-23 17:06:41.838088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:12:46.473 [2024-07-23 17:06:41.838090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:47.412 [2024-07-23 17:06:42.503598] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:47.412 [2024-07-23 17:06:42.506225] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x103fa70 PMD being used: compress_qat 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:12:47.412 [2024-07-23 17:06:42.511021] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc55819b8b0 PMD being used: compress_qat 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 [2024-07-23 17:06:42.512548] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x103f820 PMD being used: compress_qat 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 [2024-07-23 17:06:42.515766] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc55019b8b0 PMD being used: compress_qat 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 [2024-07-23 17:06:42.516066] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc54819b8b0 PMD being used: compress_qat 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:47.412 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:47.413 17:06:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.408 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:12:48.409 17:06:43 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:48.409 00:12:48.409 real 0m2.071s 00:12:48.409 user 0m6.730s 00:12:48.409 sys 0m0.605s 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:48.409 17:06:43 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:12:48.409 ************************************ 00:12:48.409 END TEST accel_cdev_decomp_full_mcore 00:12:48.409 ************************************ 00:12:48.409 17:06:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:48.409 17:06:43 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:48.409 17:06:43 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:12:48.409 17:06:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:48.409 17:06:43 accel -- common/autotest_common.sh@10 -- # set +x 00:12:48.409 ************************************ 00:12:48.409 START TEST accel_cdev_decomp_mthread 00:12:48.409 ************************************ 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:12:48.409 17:06:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:12:48.409 [2024-07-23 17:06:43.805052] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:48.409 [2024-07-23 17:06:43.805128] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4087035 ] 00:12:48.676 [2024-07-23 17:06:43.937539] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:48.676 [2024-07-23 17:06:43.996447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:49.244 [2024-07-23 17:06:44.661632] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:49.244 [2024-07-23 17:06:44.664239] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x9f2480 PMD being used: compress_qat 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 [2024-07-23 17:06:44.669005] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x9f47f0 PMD being used: compress_qat 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.504 [2024-07-23 17:06:44.671573] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x9f6790 PMD being used: compress_qat 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 
17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:12:49.504 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:49.505 17:06:44 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:50.443 00:12:50.443 real 0m2.070s 00:12:50.443 user 0m1.468s 00:12:50.443 sys 0m0.601s 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:50.443 17:06:45 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:12:50.443 ************************************ 00:12:50.443 END TEST accel_cdev_decomp_mthread 00:12:50.443 ************************************ 00:12:50.703 17:06:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:50.703 17:06:45 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:50.703 17:06:45 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:12:50.703 17:06:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:50.703 17:06:45 accel -- common/autotest_common.sh@10 -- # set +x 00:12:50.703 ************************************ 00:12:50.703 START TEST accel_cdev_decomp_full_mthread 00:12:50.703 ************************************ 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:12:50.703 17:06:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:12:50.703 [2024-07-23 17:06:45.970529] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:50.703 [2024-07-23 17:06:45.970659] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4087386 ] 00:12:50.963 [2024-07-23 17:06:46.166610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:50.963 [2024-07-23 17:06:46.220412] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:51.531 [2024-07-23 17:06:46.883760] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:12:51.531 [2024-07-23 17:06:46.886393] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2124480 PMD being used: compress_qat 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.531 [2024-07-23 17:06:46.890259] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2124230 PMD being used: compress_qat 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.531 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.532 [2024-07-23 17:06:46.893124] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24cd0f0 PMD being used: compress_qat 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:51.532 17:06:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:12:52.911 00:12:52.911 real 0m2.140s 00:12:52.911 user 0m1.507s 00:12:52.911 sys 0m0.635s 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:52.911 17:06:48 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:12:52.911 ************************************ 00:12:52.911 END TEST accel_cdev_decomp_full_mthread 00:12:52.911 ************************************ 00:12:52.911 17:06:48 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:12:52.911 17:06:48 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:12:52.911 17:06:48 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:12:52.911 17:06:48 accel -- accel/accel.sh@137 -- # build_accel_config 00:12:52.911 17:06:48 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:52.911 17:06:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:52.911 17:06:48 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:12:52.911 17:06:48 accel -- common/autotest_common.sh@10 -- # set +x 00:12:52.911 17:06:48 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:12:52.911 17:06:48 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:12:52.911 17:06:48 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:12:52.911 17:06:48 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:12:52.911 17:06:48 accel -- accel/accel.sh@40 -- # local IFS=, 00:12:52.911 17:06:48 accel -- accel/accel.sh@41 -- # jq -r . 00:12:52.911 ************************************ 00:12:52.911 START TEST accel_dif_functional_tests 00:12:52.911 ************************************ 00:12:52.911 17:06:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:12:52.911 [2024-07-23 17:06:48.198927] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:52.911 [2024-07-23 17:06:48.198984] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4087604 ] 00:12:52.911 [2024-07-23 17:06:48.331533] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:12:53.170 [2024-07-23 17:06:48.388662] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:53.170 [2024-07-23 17:06:48.388766] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:12:53.170 [2024-07-23 17:06:48.388767] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:53.170 00:12:53.170 00:12:53.170 CUnit - A unit testing framework for C - Version 2.1-3 00:12:53.170 http://cunit.sourceforge.net/ 00:12:53.170 00:12:53.170 00:12:53.170 Suite: accel_dif 00:12:53.170 Test: verify: DIF generated, GUARD check ...passed 00:12:53.170 Test: verify: DIF generated, APPTAG check ...passed 00:12:53.170 Test: verify: DIF generated, REFTAG check ...passed 00:12:53.170 Test: verify: DIF not generated, GUARD check ...[2024-07-23 17:06:48.485714] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:12:53.170 passed 00:12:53.170 Test: verify: DIF not generated, APPTAG check ...[2024-07-23 17:06:48.485787] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:12:53.170 passed 00:12:53.170 Test: verify: DIF not generated, REFTAG check ...[2024-07-23 17:06:48.485824] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:12:53.170 passed 00:12:53.170 Test: verify: APPTAG correct, APPTAG check ...passed 00:12:53.170 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-23 17:06:48.485906] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:12:53.170 passed 
00:12:53.170 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:12:53.170 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:12:53.170 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:12:53.171 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-23 17:06:48.486076] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:12:53.171 passed 00:12:53.171 Test: verify copy: DIF generated, GUARD check ...passed 00:12:53.171 Test: verify copy: DIF generated, APPTAG check ...passed 00:12:53.171 Test: verify copy: DIF generated, REFTAG check ...passed 00:12:53.171 Test: verify copy: DIF not generated, GUARD check ...[2024-07-23 17:06:48.486255] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:12:53.171 passed 00:12:53.171 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-23 17:06:48.486293] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:12:53.171 passed 00:12:53.171 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-23 17:06:48.486333] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:12:53.171 passed 00:12:53.171 Test: generate copy: DIF generated, GUARD check ...passed 00:12:53.171 Test: generate copy: DIF generated, APTTAG check ...passed 00:12:53.171 Test: generate copy: DIF generated, REFTAG check ...passed 00:12:53.171 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:12:53.171 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:12:53.171 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:12:53.171 Test: generate copy: iovecs-len validate ...[2024-07-23 17:06:48.486613] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:12:53.171 passed 00:12:53.171 Test: generate copy: buffer alignment validate ...passed 00:12:53.171 00:12:53.171 Run Summary: Type Total Ran Passed Failed Inactive 00:12:53.171 suites 1 1 n/a 0 0 00:12:53.171 tests 26 26 26 0 0 00:12:53.171 asserts 115 115 115 0 n/a 00:12:53.171 00:12:53.171 Elapsed time = 0.003 seconds 00:12:53.430 00:12:53.430 real 0m0.547s 00:12:53.430 user 0m0.732s 00:12:53.430 sys 0m0.248s 00:12:53.430 17:06:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:53.430 17:06:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:12:53.430 ************************************ 00:12:53.430 END TEST accel_dif_functional_tests 00:12:53.430 ************************************ 00:12:53.430 17:06:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:12:53.430 00:12:53.430 real 0m51.579s 00:12:53.430 user 0m58.832s 00:12:53.430 sys 0m12.543s 00:12:53.430 17:06:48 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:53.430 17:06:48 accel -- common/autotest_common.sh@10 -- # set +x 00:12:53.430 ************************************ 00:12:53.430 END TEST accel 00:12:53.430 ************************************ 00:12:53.430 17:06:48 -- common/autotest_common.sh@1142 -- # return 0 00:12:53.430 17:06:48 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:12:53.430 17:06:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:12:53.430 17:06:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:53.430 17:06:48 -- common/autotest_common.sh@10 -- # set +x 00:12:53.430 ************************************ 00:12:53.430 START TEST accel_rpc 00:12:53.430 ************************************ 00:12:53.431 17:06:48 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:12:53.690 * Looking for test storage... 
00:12:53.690 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:12:53.690 17:06:48 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:12:53.690 17:06:48 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=4087836 00:12:53.690 17:06:48 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 4087836 00:12:53.690 17:06:48 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:12:53.690 17:06:48 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 4087836 ']' 00:12:53.690 17:06:48 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:53.690 17:06:48 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:53.690 17:06:48 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:53.690 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:53.690 17:06:48 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:53.690 17:06:48 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:53.690 [2024-07-23 17:06:48.995339] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:12:53.690 [2024-07-23 17:06:48.995410] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4087836 ]
00:12:53.949 [2024-07-23 17:06:49.129691] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:12:53.949 [2024-07-23 17:06:49.179041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:12:54.517 17:06:49 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:12:54.517 17:06:49 accel_rpc -- common/autotest_common.sh@862 -- # return 0
00:12:54.517 17:06:49 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]]
00:12:54.517 17:06:49 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]]
00:12:54.517 17:06:49 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]]
00:12:54.517 17:06:49 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]]
00:12:54.517 17:06:49 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite
00:12:54.517 17:06:49 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:12:54.517 17:06:49 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:54.517 17:06:49 accel_rpc -- common/autotest_common.sh@10 -- # set +x
00:12:54.775 ************************************
00:12:54.775 START TEST accel_assign_opcode
00:12:54.775 ************************************
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:12:54.775 [2024-07-23 17:06:49.969521] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:12:54.775 [2024-07-23 17:06:49.981547] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:12:54.775 17:06:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software
00:12:54.776 17:06:50 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:12:55.035 software
00:12:55.035
00:12:55.035 real 0m0.255s
00:12:55.035 user 0m0.049s
00:12:55.035 sys 0m0.014s
00:12:55.035 17:06:50 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:55.035 17:06:50 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:12:55.035 ************************************
00:12:55.035 END TEST accel_assign_opcode
00:12:55.035 ************************************
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@1142 -- # return 0
00:12:55.035 17:06:50 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 4087836
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 4087836 ']'
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 4087836
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@953 -- # uname
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4087836
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4087836'
00:12:55.035 killing process with pid 4087836
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@967 -- # kill 4087836
00:12:55.035 17:06:50 accel_rpc -- common/autotest_common.sh@972 -- # wait 4087836
00:12:55.294
00:12:55.294 real 0m1.870s
00:12:55.294 user 0m1.935s
00:12:55.294 sys 0m0.613s
00:12:55.294 17:06:50 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:55.294 17:06:50 accel_rpc -- common/autotest_common.sh@10 -- # set +x
00:12:55.294 ************************************
00:12:55.294 END TEST accel_rpc
00:12:55.294 ************************************
00:12:55.552 17:06:50 -- common/autotest_common.sh@1142 -- # return 0
00:12:55.552 17:06:50 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh
00:12:55.552 17:06:50 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:12:55.552 17:06:50 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:55.552 17:06:50 -- common/autotest_common.sh@10 -- # set +x
00:12:55.552 ************************************
00:12:55.552 START TEST app_cmdline
00:12:55.552 ************************************
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh
00:12:55.552 * Looking for test storage...
00:12:55.552 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app
00:12:55.552 17:06:50 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT
00:12:55.552 17:06:50 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=4088093
00:12:55.552 17:06:50 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods
00:12:55.552 17:06:50 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 4088093
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 4088093 ']'
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:12:55.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable
00:12:55.552 17:06:50 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:12:55.552 [2024-07-23 17:06:50.955800] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:12:55.552 [2024-07-23 17:06:50.955872] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4088093 ]
00:12:55.811 [2024-07-23 17:06:51.088049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:12:55.811 [2024-07-23 17:06:51.141152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:12:56.744 17:06:51 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:12:56.744 17:06:51 app_cmdline -- common/autotest_common.sh@862 -- # return 0
00:12:56.744 17:06:51 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version
00:12:56.744 {
00:12:56.744 "version": "SPDK v24.09-pre git sha1 b8378f94e",
00:12:56.744 "fields": {
00:12:56.744 "major": 24,
00:12:56.744 "minor": 9,
00:12:56.744 "patch": 0,
00:12:56.744 "suffix": "-pre",
00:12:56.744 "commit": "b8378f94e"
00:12:56.744 }
00:12:56.744 }
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=()
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods")
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version")
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort))
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]'
00:12:56.744 17:06:52 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable
00:12:56.744 17:06:52 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:12:56.744 17:06:52 app_cmdline -- app/cmdline.sh@26 -- # sort
00:12:56.744 17:06:52 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:12:57.003 17:06:52 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 ))
00:12:57.003 17:06:52 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]]
00:12:57.003 17:06:52 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@648 -- # local es=0
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:12:57.003 17:06:52 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:12:57.003 request:
00:12:57.003 {
00:12:57.003 "method": "env_dpdk_get_mem_stats",
00:12:57.003 "req_id": 1
00:12:57.003 }
00:12:57.003 Got JSON-RPC error response
00:12:57.003 response:
00:12:57.003 {
00:12:57.003 "code": -32601,
00:12:57.003 "message": "Method not found"
00:12:57.003 }
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@651 -- # es=1
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:12:57.262 17:06:52 app_cmdline -- app/cmdline.sh@1 -- # killprocess 4088093
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 4088093 ']'
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 4088093
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@953 -- # uname
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4088093
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4088093'
00:12:57.262 killing process with pid 4088093
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@967 -- # kill 4088093
00:12:57.262 17:06:52 app_cmdline -- common/autotest_common.sh@972 -- # wait 4088093
00:12:57.522
00:12:57.522 real 0m2.076s
00:12:57.522 user 0m2.512s
00:12:57.522 sys 0m0.619s
00:12:57.522 17:06:52 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:57.522 17:06:52 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:12:57.522 ************************************
00:12:57.522 END TEST app_cmdline
00:12:57.522 ************************************
00:12:57.522 17:06:52 -- common/autotest_common.sh@1142 -- # return 0
00:12:57.522 17:06:52 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh
00:12:57.522 17:06:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:12:57.522 17:06:52 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:57.522 17:06:52 -- common/autotest_common.sh@10 -- # set +x
00:12:57.522 ************************************
00:12:57.522 START TEST version
00:12:57.522 ************************************
00:12:57.522 17:06:52 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh
00:12:57.781 * Looking for test storage...
00:12:57.781 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app
00:12:57.781 17:06:53 version -- app/version.sh@17 -- # get_header_version major
00:12:57.781 17:06:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # cut -f2
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # tr -d '"'
00:12:57.781 17:06:53 version -- app/version.sh@17 -- # major=24
00:12:57.781 17:06:53 version -- app/version.sh@18 -- # get_header_version minor
00:12:57.781 17:06:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # cut -f2
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # tr -d '"'
00:12:57.781 17:06:53 version -- app/version.sh@18 -- # minor=9
00:12:57.781 17:06:53 version -- app/version.sh@19 -- # get_header_version patch
00:12:57.781 17:06:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # cut -f2
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # tr -d '"'
00:12:57.781 17:06:53 version -- app/version.sh@19 -- # patch=0
00:12:57.781 17:06:53 version -- app/version.sh@20 -- # get_header_version suffix
00:12:57.781 17:06:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # cut -f2
00:12:57.781 17:06:53 version -- app/version.sh@14 -- # tr -d '"'
00:12:57.781 17:06:53 version -- app/version.sh@20 -- # suffix=-pre
00:12:57.781 17:06:53 version -- app/version.sh@22 -- # version=24.9
00:12:57.781 17:06:53 version -- app/version.sh@25 -- # (( patch != 0 ))
00:12:57.781 17:06:53 version -- app/version.sh@28 -- # version=24.9rc0
00:12:57.781 17:06:53 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python
00:12:57.781 17:06:53 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)'
00:12:57.781 17:06:53 version -- app/version.sh@30 -- # py_version=24.9rc0
00:12:57.781 17:06:53 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]]
00:12:57.781
00:12:57.781 real 0m0.198s
00:12:57.781 user 0m0.096s
00:12:57.781 sys 0m0.150s
00:12:57.781 17:06:53 version -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:57.781 17:06:53 version -- common/autotest_common.sh@10 -- # set +x
00:12:57.781 ************************************
00:12:57.781 END TEST version
00:12:57.781 ************************************
00:12:57.781 17:06:53 -- common/autotest_common.sh@1142 -- # return 0
00:12:57.781 17:06:53 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']'
00:12:57.781 17:06:53 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh
00:12:57.781 17:06:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:12:57.781 17:06:53 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:57.781 17:06:53 -- common/autotest_common.sh@10 -- # set +x
00:12:58.041 ************************************
00:12:58.041 START TEST blockdev_general
00:12:58.041 ************************************
00:12:58.041 17:06:53 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh
00:12:58.041 * Looking for test storage...
00:12:58.041 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:12:58.041 17:06:53 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@20 -- # :
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5
00:12:58.041 17:06:53 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']'
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device=
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@683 -- # dek=
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx=
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc=
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']'
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]]
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=4088560
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc
00:12:58.042 17:06:53 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 4088560
00:12:58.042 17:06:53 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 4088560 ']'
00:12:58.042 17:06:53 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:12:58.042 17:06:53 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100
00:12:58.042 17:06:53 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:12:58.042 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:58.042 17:06:53 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:58.042 17:06:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:58.042 [2024-07-23 17:06:53.403715] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:12:58.042 [2024-07-23 17:06:53.403790] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4088560 ] 00:12:58.300 [2024-07-23 17:06:53.538371] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:58.300 [2024-07-23 17:06:53.588218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:59.237 17:06:54 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:59.237 17:06:54 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:12:59.237 17:06:54 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:12:59.237 17:06:54 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:12:59.237 17:06:54 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:12:59.237 17:06:54 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.237 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.237 [2024-07-23 17:06:54.566257] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:59.237 [2024-07-23 17:06:54.566313] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:12:59.237 00:12:59.237 [2024-07-23 17:06:54.574241] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:12:59.237 [2024-07-23 17:06:54.574266] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:12:59.237 00:12:59.237 Malloc0 00:12:59.237 Malloc1 00:12:59.237 Malloc2 00:12:59.237 Malloc3 00:12:59.237 Malloc4 00:12:59.497 Malloc5 00:12:59.497 Malloc6 00:12:59.497 Malloc7 00:12:59.497 Malloc8 00:12:59.497 Malloc9 00:12:59.497 [2024-07-23 17:06:54.722825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:12:59.497 [2024-07-23 17:06:54.722876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:59.497 [2024-07-23 17:06:54.722902] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x835090 00:12:59.497 [2024-07-23 17:06:54.722915] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:59.497 [2024-07-23 17:06:54.724237] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:59.497 [2024-07-23 17:06:54.724266] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:12:59.497 TestPT 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:12:59.497 5000+0 records in 00:12:59.497 5000+0 records out 00:12:59.497 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0251358 s, 407 MB/s 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.497 AIO0 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:12:59.497 17:06:54 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.497 17:06:54 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:12:59.497 17:06:54 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.757 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.757 17:06:54 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.757 17:06:54 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:12:59.757 17:06:54 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:12:59.757 17:06:54 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:59.757 17:06:54 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:12:59.757 17:06:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:59.757 17:06:55 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:59.757 17:06:55 blockdev_general 
-- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:12:59.757 17:06:55 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r .name 00:12:59.758 17:06:55 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b0848f4f-942d-4e56-9637-abca00ff00a6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b0848f4f-942d-4e56-9637-abca00ff00a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "6ff215ca-2865-5148-9996-aa1a950ad325"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6ff215ca-2865-5148-9996-aa1a950ad325",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "39562293-9a35-5f2e-8af2-3ad7e5d7e991"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "39562293-9a35-5f2e-8af2-3ad7e5d7e991",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "a948d93a-5f68-52a4-b793-f663ab5d553e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a948d93a-5f68-52a4-b793-f663ab5d553e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "64e1a375-f827-5617-88b2-953f7a85d996"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "64e1a375-f827-5617-88b2-953f7a85d996",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "4ea92a29-d4a3-5226-9848-6b257cbde9a2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4ea92a29-d4a3-5226-9848-6b257cbde9a2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d767886d-742f-5edf-9020-f7f650447b76"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d767886d-742f-5edf-9020-f7f650447b76",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "02f5ba5c-be70-59fb-857d-32956af0ad1d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "02f5ba5c-be70-59fb-857d-32956af0ad1d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "27166f81-0fe0-550e-b4d4-bea756004117"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "27166f81-0fe0-550e-b4d4-bea756004117",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "00b0614a-d996-5306-8f6f-4ebdb78fd425"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "00b0614a-d996-5306-8f6f-4ebdb78fd425",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "143ad6c7-ccae-532f-a559-1ff901ff1867"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "143ad6c7-ccae-532f-a559-1ff901ff1867",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "9e0a8ca0-3f11-5fab-a64c-cb3ad7471f75"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9e0a8ca0-3f11-5fab-a64c-cb3ad7471f75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "bbd65f9d-6d33-40cf-81d3-be16167c84f8"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bbd65f9d-6d33-40cf-81d3-be16167c84f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bbd65f9d-6d33-40cf-81d3-be16167c84f8",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5b271d0c-f315-4d8f-b027-00d5d9119f9b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5c7a05f5-7c77-4f75-bc0d-e0e6f13626e9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "24da28da-52b3-4e64-8ae9-69e278f3456a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "24da28da-52b3-4e64-8ae9-69e278f3456a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "24da28da-52b3-4e64-8ae9-69e278f3456a",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "bd9b6612-e683-44f0-ace4-5e004d94cd3f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "64665474-c564-4970-8673-0f86db021de3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "57c99b0e-b279-4cfa-9474-e90ce1b053d7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "57c99b0e-b279-4cfa-9474-e90ce1b053d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "57c99b0e-b279-4cfa-9474-e90ce1b053d7",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "149a1b04-526c-4703-bc56-2ede48ae12c5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "26f2816b-98ea-45a0-9e8e-a2285489a612",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "211a8f22-8fdb-4abe-b4e6-ad49a22a3b82"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "211a8f22-8fdb-4abe-b4e6-ad49a22a3b82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:13:00.017 17:06:55 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:13:00.017 17:06:55 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:13:00.017 17:06:55 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:13:00.017 17:06:55 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 4088560 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 4088560 ']' 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 4088560 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4088560 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4088560' 00:13:00.017 killing process with pid 4088560 00:13:00.017 17:06:55 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 4088560 00:13:00.017 17:06:55 blockdev_general -- common/autotest_common.sh@972 -- # wait 4088560 00:13:00.586 17:06:55 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:13:00.586 17:06:55 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:13:00.586 17:06:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:13:00.586 17:06:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:00.586 17:06:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:00.586 ************************************ 00:13:00.586 START TEST bdev_hello_world 00:13:00.586 ************************************ 00:13:00.586 17:06:55 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:13:00.586 [2024-07-23 17:06:55.841965] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:13:00.586 [2024-07-23 17:06:55.842026] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4088930 ] 00:13:00.586 [2024-07-23 17:06:55.963190] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:00.846 [2024-07-23 17:06:56.018003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:00.846 [2024-07-23 17:06:56.169761] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:00.846 [2024-07-23 17:06:56.169817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:13:00.846 [2024-07-23 17:06:56.169832] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:13:00.846 [2024-07-23 17:06:56.177766] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:00.846 [2024-07-23 17:06:56.177793] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:00.846 [2024-07-23 17:06:56.185776] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:00.846 [2024-07-23 17:06:56.185800] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:00.846 [2024-07-23 17:06:56.262863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:00.846 [2024-07-23 17:06:56.262919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:00.846 [2024-07-23 17:06:56.262944] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d10ad0 00:13:00.846 [2024-07-23 17:06:56.262964] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:00.846 [2024-07-23 17:06:56.264559] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:13:00.846 [2024-07-23 17:06:56.264587] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:13:01.105 [2024-07-23 17:06:56.421341] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:13:01.105 [2024-07-23 17:06:56.421409] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:13:01.105 [2024-07-23 17:06:56.421465] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:13:01.105 [2024-07-23 17:06:56.421536] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:13:01.105 [2024-07-23 17:06:56.421612] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:13:01.105 [2024-07-23 17:06:56.421643] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:13:01.105 [2024-07-23 17:06:56.421705] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:13:01.105 00:13:01.105 [2024-07-23 17:06:56.421745] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:13:01.364 00:13:01.364 real 0m0.969s 00:13:01.364 user 0m0.604s 00:13:01.364 sys 0m0.316s 00:13:01.364 17:06:56 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:01.364 17:06:56 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:13:01.364 ************************************ 00:13:01.364 END TEST bdev_hello_world 00:13:01.364 ************************************ 00:13:01.624 17:06:56 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:13:01.624 17:06:56 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:13:01.624 17:06:56 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:13:01.624 17:06:56 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:01.624 17:06:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:01.624 ************************************ 00:13:01.624 START 
TEST bdev_bounds 00:13:01.624 ************************************ 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=4089113 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 4089113' 00:13:01.624 Process bdevio pid: 4089113 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 4089113 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 4089113 ']' 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:01.624 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:01.624 17:06:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:13:01.624 [2024-07-23 17:06:56.895508] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:13:01.624 [2024-07-23 17:06:56.895575] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4089113 ] 00:13:01.624 [2024-07-23 17:06:57.028987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:13:01.884 [2024-07-23 17:06:57.087738] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:01.884 [2024-07-23 17:06:57.087839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:13:01.884 [2024-07-23 17:06:57.087840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:01.884 [2024-07-23 17:06:57.238827] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:01.884 [2024-07-23 17:06:57.238882] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:13:01.884 [2024-07-23 17:06:57.238902] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:13:01.884 [2024-07-23 17:06:57.246837] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:01.884 [2024-07-23 17:06:57.246863] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:01.884 [2024-07-23 17:06:57.254850] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:01.884 [2024-07-23 17:06:57.254874] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:02.143 [2024-07-23 17:06:57.332087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:02.143 [2024-07-23 17:06:57.332139] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.143 [2024-07-23 17:06:57.332155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda8a20 
00:13:02.143 [2024-07-23 17:06:57.332168] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.143 [2024-07-23 17:06:57.333636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.143 [2024-07-23 17:06:57.333663] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:13:02.711 17:06:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:02.711 17:06:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:13:02.711 17:06:57 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:13:02.711 I/O targets: 00:13:02.711 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:13:02.711 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:13:02.711 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:13:02.711 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:13:02.711 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:13:02.711 raid0: 131072 blocks of 512 bytes (64 MiB) 00:13:02.711 concat0: 131072 blocks of 512 bytes (64 MiB) 00:13:02.711 raid1: 65536 blocks of 512 bytes (32 MiB) 00:13:02.711 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:13:02.711 00:13:02.711 00:13:02.711 CUnit - A unit testing framework for C - Version 2.1-3 00:13:02.711 http://cunit.sourceforge.net/ 00:13:02.711 00:13:02.711 00:13:02.711 Suite: bdevio tests on: AIO0 00:13:02.711 Test: blockdev write read block ...passed 00:13:02.711 Test: blockdev write zeroes read block ...passed 00:13:02.711 
Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.712 Test: blockdev write zeroes read split partial ...passed 00:13:02.712 Test: blockdev reset ...passed 00:13:02.712 Test: blockdev write read 8 blocks ...passed 00:13:02.712 Test: blockdev write read size > 128k ...passed 00:13:02.712 Test: blockdev write read invalid size ...passed 00:13:02.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.712 Test: blockdev write read max offset ...passed 00:13:02.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.712 Test: blockdev writev readv 8 blocks ...passed 00:13:02.712 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.712 Test: blockdev writev readv block ...passed 00:13:02.712 Test: blockdev writev readv size > 128k ...passed 00:13:02.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.712 Test: blockdev comparev and writev ...passed 00:13:02.712 Test: blockdev nvme passthru rw ...passed 00:13:02.712 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.712 Test: blockdev nvme admin passthru ...passed 00:13:02.712 Test: blockdev copy ...passed 00:13:02.712 Suite: bdevio tests on: raid1 00:13:02.712 Test: blockdev write read block ...passed 00:13:02.712 Test: blockdev write zeroes read block ...passed 00:13:02.712 Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.712 Test: blockdev write zeroes read split partial ...passed 00:13:02.712 Test: blockdev reset ...passed 00:13:02.712 Test: blockdev write read 8 blocks ...passed 00:13:02.712 Test: blockdev write read size > 128k ...passed 00:13:02.712 Test: blockdev write read invalid size ...passed 00:13:02.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:13:02.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.712 Test: blockdev write read max offset ...passed 00:13:02.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.712 Test: blockdev writev readv 8 blocks ...passed 00:13:02.712 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.712 Test: blockdev writev readv block ...passed 00:13:02.712 Test: blockdev writev readv size > 128k ...passed 00:13:02.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.712 Test: blockdev comparev and writev ...passed 00:13:02.712 Test: blockdev nvme passthru rw ...passed 00:13:02.712 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.712 Test: blockdev nvme admin passthru ...passed 00:13:02.712 Test: blockdev copy ...passed 00:13:02.712 Suite: bdevio tests on: concat0 00:13:02.712 Test: blockdev write read block ...passed 00:13:02.712 Test: blockdev write zeroes read block ...passed 00:13:02.712 Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.712 Test: blockdev write zeroes read split partial ...passed 00:13:02.712 Test: blockdev reset ...passed 00:13:02.712 Test: blockdev write read 8 blocks ...passed 00:13:02.712 Test: blockdev write read size > 128k ...passed 00:13:02.712 Test: blockdev write read invalid size ...passed 00:13:02.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.712 Test: blockdev write read max offset ...passed 00:13:02.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.712 Test: blockdev writev readv 8 blocks ...passed 00:13:02.712 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.712 Test: blockdev writev readv block ...passed 00:13:02.712 Test: blockdev writev readv size > 128k ...passed 00:13:02.712 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:13:02.712 Test: blockdev comparev and writev ...passed 00:13:02.712 Test: blockdev nvme passthru rw ...passed 00:13:02.712 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.712 Test: blockdev nvme admin passthru ...passed 00:13:02.712 Test: blockdev copy ...passed 00:13:02.712 Suite: bdevio tests on: raid0 00:13:02.712 Test: blockdev write read block ...passed 00:13:02.712 Test: blockdev write zeroes read block ...passed 00:13:02.712 Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.712 Test: blockdev write zeroes read split partial ...passed 00:13:02.712 Test: blockdev reset ...passed 00:13:02.712 Test: blockdev write read 8 blocks ...passed 00:13:02.712 Test: blockdev write read size > 128k ...passed 00:13:02.712 Test: blockdev write read invalid size ...passed 00:13:02.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.712 Test: blockdev write read max offset ...passed 00:13:02.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.712 Test: blockdev writev readv 8 blocks ...passed 00:13:02.712 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.712 Test: blockdev writev readv block ...passed 00:13:02.712 Test: blockdev writev readv size > 128k ...passed 00:13:02.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.712 Test: blockdev comparev and writev ...passed 00:13:02.712 Test: blockdev nvme passthru rw ...passed 00:13:02.712 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.712 Test: blockdev nvme admin passthru ...passed 00:13:02.712 Test: blockdev copy ...passed 00:13:02.712 Suite: bdevio tests on: TestPT 00:13:02.712 Test: blockdev write read block ...passed 00:13:02.712 Test: blockdev write zeroes read block ...passed 
00:13:02.712 Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.712 Test: blockdev write zeroes read split partial ...passed 00:13:02.712 Test: blockdev reset ...passed 00:13:02.712 Test: blockdev write read 8 blocks ...passed 00:13:02.712 Test: blockdev write read size > 128k ...passed 00:13:02.712 Test: blockdev write read invalid size ...passed 00:13:02.712 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.712 Test: blockdev write read max offset ...passed 00:13:02.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.712 Test: blockdev writev readv 8 blocks ...passed 00:13:02.712 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.712 Test: blockdev writev readv block ...passed 00:13:02.712 Test: blockdev writev readv size > 128k ...passed 00:13:02.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.712 Test: blockdev comparev and writev ...passed 00:13:02.712 Test: blockdev nvme passthru rw ...passed 00:13:02.712 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.712 Test: blockdev nvme admin passthru ...passed 00:13:02.712 Test: blockdev copy ...passed 00:13:02.712 Suite: bdevio tests on: Malloc2p7 00:13:02.712 Test: blockdev write read block ...passed 00:13:02.712 Test: blockdev write zeroes read block ...passed 00:13:02.712 Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.712 Test: blockdev write zeroes read split partial ...passed 00:13:02.712 Test: blockdev reset ...passed 00:13:02.712 Test: blockdev write read 8 blocks ...passed 00:13:02.712 Test: blockdev write read size > 128k ...passed 00:13:02.712 Test: blockdev write read invalid size ...passed 00:13:02.712 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:13:02.712 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.712 Test: blockdev write read max offset ...passed 00:13:02.712 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.712 Test: blockdev writev readv 8 blocks ...passed 00:13:02.712 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.712 Test: blockdev writev readv block ...passed 00:13:02.712 Test: blockdev writev readv size > 128k ...passed 00:13:02.712 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.712 Test: blockdev comparev and writev ...passed 00:13:02.712 Test: blockdev nvme passthru rw ...passed 00:13:02.712 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.712 Test: blockdev nvme admin passthru ...passed 00:13:02.712 Test: blockdev copy ...passed 00:13:02.712 Suite: bdevio tests on: Malloc2p6 00:13:02.712 Test: blockdev write read block ...passed 00:13:02.712 Test: blockdev write zeroes read block ...passed 00:13:02.712 Test: blockdev write zeroes read no split ...passed 00:13:02.712 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 00:13:02.973 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc2p5 00:13:02.973 Test: blockdev write read block ...passed 00:13:02.973 Test: blockdev write zeroes read block ...passed 00:13:02.973 Test: blockdev write zeroes read no split ...passed 00:13:02.973 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 00:13:02.973 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc2p4 00:13:02.973 Test: blockdev write read block ...passed 00:13:02.973 Test: blockdev write zeroes read block 
...passed 00:13:02.973 Test: blockdev write zeroes read no split ...passed 00:13:02.973 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 00:13:02.973 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc2p3 00:13:02.973 Test: blockdev write read block ...passed 00:13:02.973 Test: blockdev write zeroes read block ...passed 00:13:02.973 Test: blockdev write zeroes read no split ...passed 00:13:02.973 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 00:13:02.973 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc2p2 00:13:02.973 Test: blockdev write read block ...passed 00:13:02.973 Test: blockdev write zeroes read block ...passed 00:13:02.973 Test: blockdev write zeroes read no split ...passed 00:13:02.973 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 
00:13:02.973 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc2p1 00:13:02.973 Test: blockdev write read block ...passed 00:13:02.973 Test: blockdev write zeroes read block ...passed 00:13:02.973 Test: blockdev write zeroes read no split ...passed 00:13:02.973 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 00:13:02.973 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc2p0 00:13:02.973 Test: blockdev write read block ...passed 00:13:02.973 Test: blockdev write 
zeroes read block ...passed 00:13:02.973 Test: blockdev write zeroes read no split ...passed 00:13:02.973 Test: blockdev write zeroes read split ...passed 00:13:02.973 Test: blockdev write zeroes read split partial ...passed 00:13:02.973 Test: blockdev reset ...passed 00:13:02.973 Test: blockdev write read 8 blocks ...passed 00:13:02.973 Test: blockdev write read size > 128k ...passed 00:13:02.973 Test: blockdev write read invalid size ...passed 00:13:02.973 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.973 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.973 Test: blockdev write read max offset ...passed 00:13:02.973 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.973 Test: blockdev writev readv 8 blocks ...passed 00:13:02.973 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.973 Test: blockdev writev readv block ...passed 00:13:02.973 Test: blockdev writev readv size > 128k ...passed 00:13:02.973 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.973 Test: blockdev comparev and writev ...passed 00:13:02.973 Test: blockdev nvme passthru rw ...passed 00:13:02.973 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.973 Test: blockdev nvme admin passthru ...passed 00:13:02.973 Test: blockdev copy ...passed 00:13:02.973 Suite: bdevio tests on: Malloc1p1 00:13:02.974 Test: blockdev write read block ...passed 00:13:02.974 Test: blockdev write zeroes read block ...passed 00:13:02.974 Test: blockdev write zeroes read no split ...passed 00:13:02.974 Test: blockdev write zeroes read split ...passed 00:13:02.974 Test: blockdev write zeroes read split partial ...passed 00:13:02.974 Test: blockdev reset ...passed 00:13:02.974 Test: blockdev write read 8 blocks ...passed 00:13:02.974 Test: blockdev write read size > 128k ...passed 00:13:02.974 Test: blockdev write read invalid size ...passed 00:13:02.974 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:13:02.974 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.974 Test: blockdev write read max offset ...passed 00:13:02.974 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.974 Test: blockdev writev readv 8 blocks ...passed 00:13:02.974 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.974 Test: blockdev writev readv block ...passed 00:13:02.974 Test: blockdev writev readv size > 128k ...passed 00:13:02.974 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.974 Test: blockdev comparev and writev ...passed 00:13:02.974 Test: blockdev nvme passthru rw ...passed 00:13:02.974 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.974 Test: blockdev nvme admin passthru ...passed 00:13:02.974 Test: blockdev copy ...passed 00:13:02.974 Suite: bdevio tests on: Malloc1p0 00:13:02.974 Test: blockdev write read block ...passed 00:13:02.974 Test: blockdev write zeroes read block ...passed 00:13:02.974 Test: blockdev write zeroes read no split ...passed 00:13:02.974 Test: blockdev write zeroes read split ...passed 00:13:02.974 Test: blockdev write zeroes read split partial ...passed 00:13:02.974 Test: blockdev reset ...passed 00:13:02.974 Test: blockdev write read 8 blocks ...passed 00:13:02.974 Test: blockdev write read size > 128k ...passed 00:13:02.974 Test: blockdev write read invalid size ...passed 00:13:02.974 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.974 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.974 Test: blockdev write read max offset ...passed 00:13:02.974 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.974 Test: blockdev writev readv 8 blocks ...passed 00:13:02.974 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.974 Test: blockdev writev readv block ...passed 00:13:02.974 Test: blockdev writev readv size > 
128k ...passed 00:13:02.974 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.974 Test: blockdev comparev and writev ...passed 00:13:02.974 Test: blockdev nvme passthru rw ...passed 00:13:02.974 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.974 Test: blockdev nvme admin passthru ...passed 00:13:02.974 Test: blockdev copy ...passed 00:13:02.974 Suite: bdevio tests on: Malloc0 00:13:02.974 Test: blockdev write read block ...passed 00:13:02.974 Test: blockdev write zeroes read block ...passed 00:13:02.974 Test: blockdev write zeroes read no split ...passed 00:13:02.974 Test: blockdev write zeroes read split ...passed 00:13:02.974 Test: blockdev write zeroes read split partial ...passed 00:13:02.974 Test: blockdev reset ...passed 00:13:02.974 Test: blockdev write read 8 blocks ...passed 00:13:02.974 Test: blockdev write read size > 128k ...passed 00:13:02.974 Test: blockdev write read invalid size ...passed 00:13:02.974 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:13:02.974 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:13:02.974 Test: blockdev write read max offset ...passed 00:13:02.974 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:13:02.974 Test: blockdev writev readv 8 blocks ...passed 00:13:02.974 Test: blockdev writev readv 30 x 1block ...passed 00:13:02.974 Test: blockdev writev readv block ...passed 00:13:02.974 Test: blockdev writev readv size > 128k ...passed 00:13:02.974 Test: blockdev writev readv size > 128k in two iovs ...passed 00:13:02.974 Test: blockdev comparev and writev ...passed 00:13:02.974 Test: blockdev nvme passthru rw ...passed 00:13:02.974 Test: blockdev nvme passthru vendor specific ...passed 00:13:02.974 Test: blockdev nvme admin passthru ...passed 00:13:02.974 Test: blockdev copy ...passed 00:13:02.974 00:13:02.974 Run Summary: Type Total Ran Passed Failed Inactive 00:13:02.974 suites 16 16 n/a 0 0 00:13:02.974 
tests 368 368 368 0 0 00:13:02.974 asserts 2224 2224 2224 0 n/a 00:13:02.974 00:13:02.974 Elapsed time = 0.682 seconds 00:13:02.974 0 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 4089113 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 4089113 ']' 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 4089113 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4089113 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4089113' 00:13:02.974 killing process with pid 4089113 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 4089113 00:13:02.974 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 4089113 00:13:03.233 17:06:58 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:13:03.233 00:13:03.233 real 0m1.803s 00:13:03.233 user 0m4.553s 00:13:03.233 sys 0m0.520s 00:13:03.233 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:03.233 17:06:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:13:03.233 ************************************ 00:13:03.233 END TEST bdev_bounds 00:13:03.233 ************************************ 00:13:03.493 17:06:58 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:13:03.493 17:06:58 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:13:03.493 17:06:58 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:03.493 17:06:58 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:03.493 17:06:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:03.493 ************************************ 00:13:03.493 START TEST bdev_nbd 00:13:03.493 ************************************ 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=4089332 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 4089332 /var/tmp/spdk-nbd.sock 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 4089332 ']' 00:13:03.493 17:06:58 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:13:03.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:03.493 17:06:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:13:03.493 [2024-07-23 17:06:58.801395] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:13:03.493 [2024-07-23 17:06:58.801465] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:03.753 [2024-07-23 17:06:58.933460] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:03.753 [2024-07-23 17:06:58.983197] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:03.753 [2024-07-23 17:06:59.127460] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:03.753 [2024-07-23 17:06:59.127508] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:13:03.753 [2024-07-23 17:06:59.127523] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:13:03.753 [2024-07-23 17:06:59.135470] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:03.753 [2024-07-23 17:06:59.135497] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:03.753 [2024-07-23 17:06:59.143481] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:03.753 [2024-07-23 17:06:59.143505] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:04.012 [2024-07-23 17:06:59.215672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:04.012 [2024-07-23 17:06:59.215722] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:04.012 [2024-07-23 17:06:59.215739] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d3b3a0 00:13:04.012 [2024-07-23 17:06:59.215752] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:04.012 [2024-07-23 17:06:59.217153] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:04.012 [2024-07-23 17:06:59.217183] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:04.580 17:06:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:04.839 1+0 records in 00:13:04.839 1+0 records out 00:13:04.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243848 s, 16.8 MB/s 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:04.839 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:05.407 1+0 records in 00:13:05.407 1+0 records out 00:13:05.407 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298738 s, 13.7 MB/s 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:05.407 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:05.408 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:05.408 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:05.408 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:05.408 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:13:05.703 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:05.704 1+0 records in 00:13:05.704 1+0 records out 00:13:05.704 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339754 s, 12.1 MB/s 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:05.704 17:07:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:05.962 1+0 records in 00:13:05.962 1+0 records out 00:13:05.962 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000346554 s, 11.8 MB/s 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:05.962 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:06.220 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:06.221 1+0 records in 00:13:06.221 1+0 records out 00:13:06.221 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000440044 s, 9.3 MB/s 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:06.221 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:06.479 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:06.479 1+0 records in 00:13:06.479 1+0 records out 00:13:06.479 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000373545 s, 11.0 MB/s 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:06.480 17:07:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:06.738 1+0 records in 00:13:06.738 1+0 records out 00:13:06.738 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000529878 s, 7.7 MB/s 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:06.738 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:06.738 17:07:02 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:06.996 1+0 records in 00:13:06.996 1+0 records out 00:13:06.996 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000526605 s, 7.8 MB/s 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:06.996 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:07.254 1+0 records in 00:13:07.254 1+0 records out 
00:13:07.254 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000418521 s, 9.8 MB/s 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:07.254 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:07.512 17:07:02 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:07.512 1+0 records in 00:13:07.512 1+0 records out 00:13:07.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000555622 s, 7.4 MB/s 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:07.512 17:07:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:07.771 1+0 records in 00:13:07.771 1+0 records out 00:13:07.771 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000646554 s, 6.3 MB/s 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:07.771 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:08.030 1+0 records in 00:13:08.030 1+0 records out 00:13:08.030 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000728383 s, 5.6 MB/s 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:08.030 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:08.288 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:08.288 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:08.288 
17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:08.288 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:08.288 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:13:08.288 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:13:08.288 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:08.547 1+0 records in 00:13:08.547 1+0 records out 00:13:08.547 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00074191 s, 5.5 MB/s 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:08.547 17:07:03 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:08.547 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:13:08.805 17:07:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:13:08.805 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:13:08.805 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:08.806 1+0 records in 00:13:08.806 1+0 records out 00:13:08.806 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000529685 s, 7.7 MB/s 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:08.806 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:09.064 1+0 records in 00:13:09.064 1+0 records out 00:13:09.064 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000715741 s, 5.7 MB/s 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:09.064 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:09.325 1+0 records in 00:13:09.325 1+0 records out 00:13:09.325 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000760646 s, 5.4 MB/s 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:13:09.325 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd0", 00:13:09.585 "bdev_name": "Malloc0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd1", 00:13:09.585 "bdev_name": "Malloc1p0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd2", 00:13:09.585 "bdev_name": "Malloc1p1" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd3", 00:13:09.585 "bdev_name": "Malloc2p0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd4", 00:13:09.585 "bdev_name": "Malloc2p1" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd5", 00:13:09.585 "bdev_name": "Malloc2p2" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd6", 00:13:09.585 "bdev_name": "Malloc2p3" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd7", 00:13:09.585 "bdev_name": "Malloc2p4" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd8", 00:13:09.585 "bdev_name": "Malloc2p5" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd9", 00:13:09.585 "bdev_name": "Malloc2p6" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd10", 00:13:09.585 "bdev_name": "Malloc2p7" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd11", 00:13:09.585 "bdev_name": "TestPT" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd12", 00:13:09.585 "bdev_name": "raid0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd13", 00:13:09.585 "bdev_name": "concat0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd14", 00:13:09.585 "bdev_name": "raid1" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd15", 00:13:09.585 "bdev_name": "AIO0" 00:13:09.585 } 00:13:09.585 ]' 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd0", 00:13:09.585 "bdev_name": "Malloc0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd1", 00:13:09.585 "bdev_name": "Malloc1p0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd2", 00:13:09.585 "bdev_name": "Malloc1p1" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd3", 00:13:09.585 "bdev_name": "Malloc2p0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd4", 00:13:09.585 "bdev_name": "Malloc2p1" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd5", 00:13:09.585 "bdev_name": "Malloc2p2" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd6", 00:13:09.585 "bdev_name": "Malloc2p3" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd7", 00:13:09.585 "bdev_name": "Malloc2p4" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd8", 00:13:09.585 "bdev_name": "Malloc2p5" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd9", 00:13:09.585 "bdev_name": "Malloc2p6" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd10", 00:13:09.585 "bdev_name": "Malloc2p7" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd11", 00:13:09.585 "bdev_name": "TestPT" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd12", 00:13:09.585 "bdev_name": "raid0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd13", 00:13:09.585 "bdev_name": "concat0" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd14", 00:13:09.585 "bdev_name": "raid1" 00:13:09.585 }, 00:13:09.585 { 00:13:09.585 "nbd_device": "/dev/nbd15", 00:13:09.585 "bdev_name": "AIO0" 00:13:09.585 } 00:13:09.585 ]' 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:09.585 17:07:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:13:09.844 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:09.845 17:07:05 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:09.845 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:10.103 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:10.362 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:10.621 17:07:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:10.881 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:13:11.140 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:11.141 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:11.400 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:11.660 17:07:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:13:11.919 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:13:11.919 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:13:11.919 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:13:11.919 17:07:07 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:11.919 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:11.920 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:13:11.920 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:11.920 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:11.920 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:11.920 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:12.178 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:13:12.437 17:07:07 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:12.437 17:07:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:12.697 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:12.956 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:13.215 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:13.475 17:07:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:13:13.734 17:07:09 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:13.734 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:13.994 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:13:14.253 /dev/nbd0 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:14.512 1+0 records in 00:13:14.512 1+0 records out 00:13:14.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256845 s, 15.9 MB/s 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:13:14.512 /dev/nbd1 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:13:14.512 1+0 records in 00:13:14.512 1+0 records out 00:13:14.512 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274212 s, 14.9 MB/s 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:14.512 17:07:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:13:14.771 /dev/nbd10 00:13:14.771 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:13:15.029 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:13:15.029 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:13:15.029 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:15.030 17:07:10 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:15.030 1+0 records in 00:13:15.030 1+0 records out 00:13:15.030 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278477 s, 14.7 MB/s 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:13:15.030 /dev/nbd11 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:15.030 1+0 records in 00:13:15.030 1+0 records out 00:13:15.030 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291609 s, 14.0 MB/s 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:15.030 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:13:15.288 /dev/nbd12 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:15.546 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:15.547 1+0 records in 00:13:15.547 1+0 records out 00:13:15.547 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318834 s, 12.8 MB/s 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:15.547 17:07:10 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:15.547 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:13:15.804 /dev/nbd13 00:13:15.804 17:07:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:15.804 1+0 records in 00:13:15.804 1+0 records out 00:13:15.804 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000433272 s, 9.5 MB/s 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:13:15.804 /dev/nbd14 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:15.804 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:16.063 1+0 records in 00:13:16.063 1+0 records out 00:13:16.063 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000418096 s, 
9.8 MB/s 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:16.063 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:13:16.063 /dev/nbd15 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:16.323 1+0 records in 00:13:16.323 1+0 records out 00:13:16.323 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000391434 s, 10.5 MB/s 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:13:16.323 /dev/nbd2 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:13:16.323 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:16.582 1+0 records in 00:13:16.582 1+0 records out 00:13:16.582 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000606137 s, 6.8 MB/s 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:16.582 17:07:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:13:16.842 /dev/nbd3 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:13:17.101 17:07:12 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:17.101 1+0 records in 00:13:17.101 1+0 records out 00:13:17.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00058924 s, 7.0 MB/s 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:17.101 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:13:17.361 /dev/nbd4 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:17.361 1+0 records in 00:13:17.361 1+0 records out 00:13:17.361 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000543712 s, 7.5 MB/s 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:17.361 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:13:17.620 /dev/nbd5 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:17.620 1+0 records in 00:13:17.620 1+0 records out 00:13:17.620 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000585082 s, 7.0 MB/s 00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:13:17.620 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:17.621 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.621 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:17.621 17:07:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:17.621 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:17.621 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:17.621 17:07:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:13:17.938 /dev/nbd6 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:17.938 1+0 records in 00:13:17.938 1+0 records out 00:13:17.938 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000683561 s, 6.0 MB/s 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:17.938 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:13:18.197 /dev/nbd7 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:13:18.197 17:07:13 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:18.197 1+0 records in 00:13:18.197 1+0 records out 00:13:18.197 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000775723 s, 5.3 MB/s 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:18.197 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:13:18.456 /dev/nbd8 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:18.456 1+0 records in 00:13:18.456 1+0 records out 00:13:18.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000601218 s, 6.8 MB/s 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:18.456 17:07:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:13:18.715 /dev/nbd9 00:13:18.715 17:07:14 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:13:18.715 1+0 records in 00:13:18.715 1+0 records out 00:13:18.715 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000709743 s, 5.8 MB/s 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:18.715 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:13:18.972 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd0", 00:13:18.972 "bdev_name": "Malloc0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd1", 00:13:18.972 "bdev_name": "Malloc1p0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd10", 00:13:18.972 "bdev_name": "Malloc1p1" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd11", 00:13:18.972 "bdev_name": "Malloc2p0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd12", 00:13:18.972 "bdev_name": "Malloc2p1" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd13", 00:13:18.972 "bdev_name": "Malloc2p2" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd14", 00:13:18.972 "bdev_name": "Malloc2p3" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd15", 00:13:18.972 "bdev_name": "Malloc2p4" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd2", 00:13:18.972 "bdev_name": "Malloc2p5" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd3", 00:13:18.972 "bdev_name": "Malloc2p6" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd4", 00:13:18.972 "bdev_name": "Malloc2p7" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd5", 00:13:18.972 "bdev_name": "TestPT" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd6", 00:13:18.972 
"bdev_name": "raid0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd7", 00:13:18.972 "bdev_name": "concat0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd8", 00:13:18.972 "bdev_name": "raid1" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd9", 00:13:18.972 "bdev_name": "AIO0" 00:13:18.972 } 00:13:18.972 ]' 00:13:18.972 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd0", 00:13:18.972 "bdev_name": "Malloc0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd1", 00:13:18.972 "bdev_name": "Malloc1p0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd10", 00:13:18.972 "bdev_name": "Malloc1p1" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd11", 00:13:18.972 "bdev_name": "Malloc2p0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd12", 00:13:18.972 "bdev_name": "Malloc2p1" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd13", 00:13:18.972 "bdev_name": "Malloc2p2" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd14", 00:13:18.972 "bdev_name": "Malloc2p3" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd15", 00:13:18.972 "bdev_name": "Malloc2p4" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd2", 00:13:18.972 "bdev_name": "Malloc2p5" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd3", 00:13:18.972 "bdev_name": "Malloc2p6" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd4", 00:13:18.972 "bdev_name": "Malloc2p7" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd5", 00:13:18.972 "bdev_name": "TestPT" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd6", 00:13:18.972 "bdev_name": "raid0" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd7", 00:13:18.972 "bdev_name": "concat0" 00:13:18.972 }, 00:13:18.972 { 
00:13:18.972 "nbd_device": "/dev/nbd8", 00:13:18.972 "bdev_name": "raid1" 00:13:18.972 }, 00:13:18.972 { 00:13:18.972 "nbd_device": "/dev/nbd9", 00:13:18.972 "bdev_name": "AIO0" 00:13:18.972 } 00:13:18.972 ]' 00:13:18.972 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:18.972 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:13:18.972 /dev/nbd1 00:13:18.972 /dev/nbd10 00:13:18.972 /dev/nbd11 00:13:18.972 /dev/nbd12 00:13:18.972 /dev/nbd13 00:13:18.972 /dev/nbd14 00:13:18.972 /dev/nbd15 00:13:18.972 /dev/nbd2 00:13:18.972 /dev/nbd3 00:13:18.972 /dev/nbd4 00:13:18.972 /dev/nbd5 00:13:18.972 /dev/nbd6 00:13:18.972 /dev/nbd7 00:13:18.972 /dev/nbd8 00:13:18.972 /dev/nbd9' 00:13:18.972 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:13:18.972 /dev/nbd1 00:13:18.972 /dev/nbd10 00:13:18.972 /dev/nbd11 00:13:18.972 /dev/nbd12 00:13:18.972 /dev/nbd13 00:13:18.972 /dev/nbd14 00:13:18.972 /dev/nbd15 00:13:18.972 /dev/nbd2 00:13:18.972 /dev/nbd3 00:13:18.972 /dev/nbd4 00:13:18.972 /dev/nbd5 00:13:18.972 /dev/nbd6 00:13:18.972 /dev/nbd7 00:13:18.972 /dev/nbd8 00:13:18.972 /dev/nbd9' 00:13:18.972 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:19.230 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:13:19.230 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:13:19.230 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:13:19.230 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:13:19.230 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:13:19.230 17:07:14 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:13:19.231 256+0 records in 00:13:19.231 256+0 records out 00:13:19.231 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115061 s, 91.1 MB/s 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:13:19.231 256+0 records in 00:13:19.231 256+0 records out 00:13:19.231 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.180951 s, 5.8 MB/s 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:19.231 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:13:19.489 256+0 records in 00:13:19.489 256+0 records out 00:13:19.489 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183081 s, 5.7 MB/s 00:13:19.489 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:13:19.489 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:13:19.747 256+0 records in 00:13:19.747 256+0 records out 00:13:19.747 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182462 s, 5.7 MB/s 00:13:19.747 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:19.747 17:07:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:13:19.747 256+0 records in 00:13:19.747 256+0 records out 00:13:19.747 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183112 s, 5.7 MB/s 00:13:19.747 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:19.747 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:13:20.005 256+0 records in 00:13:20.005 256+0 records out 00:13:20.005 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182954 s, 5.7 MB/s 00:13:20.005 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:20.005 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:13:20.264 256+0 records in 00:13:20.264 256+0 records out 00:13:20.264 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183204 s, 5.7 MB/s 00:13:20.264 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:20.264 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:13:20.522 256+0 records in 00:13:20.523 256+0 
records out 00:13:20.523 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183261 s, 5.7 MB/s 00:13:20.523 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:20.523 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:13:20.523 256+0 records in 00:13:20.523 256+0 records out 00:13:20.523 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.171394 s, 6.1 MB/s 00:13:20.523 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:20.523 17:07:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:13:20.781 256+0 records in 00:13:20.781 256+0 records out 00:13:20.781 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183029 s, 5.7 MB/s 00:13:20.781 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:20.781 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:13:21.039 256+0 records in 00:13:21.039 256+0 records out 00:13:21.039 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182795 s, 5.7 MB/s 00:13:21.039 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:21.039 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:13:21.298 256+0 records in 00:13:21.298 256+0 records out 00:13:21.298 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182944 s, 5.7 MB/s 00:13:21.298 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:21.298 17:07:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:13:21.298 256+0 records in 00:13:21.298 256+0 records out 00:13:21.298 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183021 s, 5.7 MB/s 00:13:21.298 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:21.298 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:13:21.557 256+0 records in 00:13:21.557 256+0 records out 00:13:21.557 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183718 s, 5.7 MB/s 00:13:21.557 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:21.557 17:07:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:13:21.816 256+0 records in 00:13:21.816 256+0 records out 00:13:21.816 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184781 s, 5.7 MB/s 00:13:21.816 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:21.816 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:13:21.816 256+0 records in 00:13:21.816 256+0 records out 00:13:21.816 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.186963 s, 5.6 MB/s 00:13:21.816 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:13:21.816 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:13:22.075 256+0 records in 00:13:22.075 256+0 records out 00:13:22.075 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.18183 s, 5.8 MB/s 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:13:22.075 
17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:13:22.075 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:22.335 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:22.594 17:07:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:22.853 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:23.112 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:23.371 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:23.631 17:07:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:23.890 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:24.149 17:07:19 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:24.149 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:24.408 17:07:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:24.667 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:24.926 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:25.185 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:25.444 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:25.445 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:25.704 17:07:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:25.963 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:13:26.222 17:07:21 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:26.222 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:13:26.481 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:13:26.481 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:13:26.481 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:13:26.741 17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:13:26.741 
17:07:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:13:26.741 malloc_lvol_verify 00:13:26.741 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:13:27.000 508c227f-3533-4957-8f6c-a242390e1a43 00:13:27.000 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:13:27.259 74f7f1b7-f26c-4327-85dd-8827ec7afbfb 00:13:27.259 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:13:27.518 /dev/nbd0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:13:27.518 mke2fs 1.46.5 (30-Dec-2021) 00:13:27.518 Discarding device blocks: 0/4096 done 00:13:27.518 Creating filesystem with 4096 1k blocks and 1024 inodes 00:13:27.518 00:13:27.518 Allocating group tables: 0/1 done 00:13:27.518 Writing inode tables: 0/1 done 00:13:27.518 Creating journal (1024 blocks): done 00:13:27.518 Writing superblocks and filesystem accounting information: 0/1 done 00:13:27.518 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 4089332 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 4089332 ']' 00:13:27.518 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 4089332 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4089332 00:13:27.777 17:07:22 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4089332' 00:13:27.777 killing process with pid 4089332 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 4089332 00:13:27.777 17:07:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 4089332 00:13:28.037 17:07:23 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:13:28.037 00:13:28.037 real 0m24.559s 00:13:28.037 user 0m29.939s 00:13:28.037 sys 0m14.230s 00:13:28.037 17:07:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:28.037 17:07:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:13:28.037 ************************************ 00:13:28.037 END TEST bdev_nbd 00:13:28.037 ************************************ 00:13:28.037 17:07:23 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:13:28.037 17:07:23 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:13:28.037 17:07:23 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:13:28.037 17:07:23 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:13:28.037 17:07:23 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:13:28.037 17:07:23 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:13:28.037 17:07:23 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:28.037 17:07:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:28.037 ************************************ 00:13:28.037 START TEST bdev_fio 00:13:28.037 ************************************ 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:13:28.037 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:13:28.037 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:13:28.038 17:07:23 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo 
'[job_Malloc2p0]' 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:13:28.038 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:13:28.297 17:07:23 
blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:28.297 17:07:23 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:13:28.297 ************************************ 00:13:28.297 START TEST bdev_fio_rw_verify 00:13:28.297 ************************************ 00:13:28.297 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:28.297 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:28.297 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:13:28.297 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:13:28.298 17:07:23 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:13:28.298 17:07:23 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:28.556 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.556 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.556 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.556 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:28.557 fio-3.35 00:13:28.557 Starting 16 threads 00:13:40.806 00:13:40.806 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=4093350: Tue Jul 23 17:07:35 2024 00:13:40.806 read: IOPS=88.5k, BW=346MiB/s (363MB/s)(3459MiB/10001msec) 00:13:40.806 slat (nsec): min=1954, max=388404, avg=36014.66, stdev=16868.16 00:13:40.806 clat (usec): min=11, max=3874, avg=292.21, stdev=148.24 00:13:40.806 lat (usec): min=20, max=3912, avg=328.22, stdev=158.00 00:13:40.806 clat percentiles (usec): 00:13:40.806 | 50.000th=[ 273], 99.000th=[ 668], 99.900th=[ 791], 99.990th=[ 922], 00:13:40.806 | 99.999th=[ 1090] 00:13:40.806 write: IOPS=137k, BW=537MiB/s (563MB/s)(5306MiB/9882msec); 0 zone resets 00:13:40.806 slat (usec): min=4, max=272, avg=50.66, stdev=19.09 00:13:40.806 clat (usec): min=11, max=2286, avg=349.15, stdev=175.47 00:13:40.806 lat (usec): min=32, max=2301, avg=399.82, stdev=186.44 00:13:40.806 clat percentiles 
(usec): 00:13:40.806 | 50.000th=[ 326], 99.000th=[ 873], 99.900th=[ 1004], 99.990th=[ 1090], 00:13:40.806 | 99.999th=[ 1418] 00:13:40.807 bw ( KiB/s): min=411240, max=765927, per=99.66%, avg=547935.32, stdev=8098.80, samples=304 00:13:40.807 iops : min=102810, max=191480, avg=136983.63, stdev=2024.69, samples=304 00:13:40.807 lat (usec) : 20=0.02%, 50=0.76%, 100=4.73%, 250=31.57%, 500=47.27% 00:13:40.807 lat (usec) : 750=14.02%, 1000=1.57% 00:13:40.807 lat (msec) : 2=0.07%, 4=0.01% 00:13:40.807 cpu : usr=99.18%, sys=0.38%, ctx=586, majf=0, minf=1881 00:13:40.807 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:13:40.807 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:40.807 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:40.807 issued rwts: total=885549,1358229,0,0 short=0,0,0,0 dropped=0,0,0,0 00:13:40.807 latency : target=0, window=0, percentile=100.00%, depth=8 00:13:40.807 00:13:40.807 Run status group 0 (all jobs): 00:13:40.807 READ: bw=346MiB/s (363MB/s), 346MiB/s-346MiB/s (363MB/s-363MB/s), io=3459MiB (3627MB), run=10001-10001msec 00:13:40.807 WRITE: bw=537MiB/s (563MB/s), 537MiB/s-537MiB/s (563MB/s-563MB/s), io=5306MiB (5563MB), run=9882-9882msec 00:13:40.807 00:13:40.807 real 0m11.995s 00:13:40.807 user 2m45.656s 00:13:40.807 sys 0m1.601s 00:13:40.807 17:07:35 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:40.807 17:07:35 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:13:40.807 ************************************ 00:13:40.807 END TEST bdev_fio_rw_verify 00:13:40.807 ************************************ 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:13:40.807 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:13:40.808 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b0848f4f-942d-4e56-9637-abca00ff00a6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b0848f4f-942d-4e56-9637-abca00ff00a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "6ff215ca-2865-5148-9996-aa1a950ad325"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6ff215ca-2865-5148-9996-aa1a950ad325",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "39562293-9a35-5f2e-8af2-3ad7e5d7e991"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "39562293-9a35-5f2e-8af2-3ad7e5d7e991",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "a948d93a-5f68-52a4-b793-f663ab5d553e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a948d93a-5f68-52a4-b793-f663ab5d553e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "64e1a375-f827-5617-88b2-953f7a85d996"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "64e1a375-f827-5617-88b2-953f7a85d996",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "4ea92a29-d4a3-5226-9848-6b257cbde9a2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4ea92a29-d4a3-5226-9848-6b257cbde9a2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "d767886d-742f-5edf-9020-f7f650447b76"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d767886d-742f-5edf-9020-f7f650447b76",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "02f5ba5c-be70-59fb-857d-32956af0ad1d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "02f5ba5c-be70-59fb-857d-32956af0ad1d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"27166f81-0fe0-550e-b4d4-bea756004117"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "27166f81-0fe0-550e-b4d4-bea756004117",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "00b0614a-d996-5306-8f6f-4ebdb78fd425"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "00b0614a-d996-5306-8f6f-4ebdb78fd425",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "143ad6c7-ccae-532f-a559-1ff901ff1867"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "143ad6c7-ccae-532f-a559-1ff901ff1867",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "9e0a8ca0-3f11-5fab-a64c-cb3ad7471f75"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9e0a8ca0-3f11-5fab-a64c-cb3ad7471f75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "bbd65f9d-6d33-40cf-81d3-be16167c84f8"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bbd65f9d-6d33-40cf-81d3-be16167c84f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bbd65f9d-6d33-40cf-81d3-be16167c84f8",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5b271d0c-f315-4d8f-b027-00d5d9119f9b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5c7a05f5-7c77-4f75-bc0d-e0e6f13626e9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "24da28da-52b3-4e64-8ae9-69e278f3456a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "24da28da-52b3-4e64-8ae9-69e278f3456a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "24da28da-52b3-4e64-8ae9-69e278f3456a",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "bd9b6612-e683-44f0-ace4-5e004d94cd3f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "64665474-c564-4970-8673-0f86db021de3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "57c99b0e-b279-4cfa-9474-e90ce1b053d7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "57c99b0e-b279-4cfa-9474-e90ce1b053d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "57c99b0e-b279-4cfa-9474-e90ce1b053d7",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "149a1b04-526c-4703-bc56-2ede48ae12c5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "26f2816b-98ea-45a0-9e8e-a2285489a612",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "211a8f22-8fdb-4abe-b4e6-ad49a22a3b82"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "211a8f22-8fdb-4abe-b4e6-ad49a22a3b82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:13:40.808 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:13:40.808 Malloc1p0 00:13:40.808 Malloc1p1 00:13:40.808 Malloc2p0 00:13:40.808 Malloc2p1 00:13:40.808 Malloc2p2 00:13:40.808 Malloc2p3 00:13:40.808 Malloc2p4 00:13:40.808 Malloc2p5 00:13:40.808 Malloc2p6 00:13:40.808 Malloc2p7 00:13:40.808 TestPT 00:13:40.808 raid0 00:13:40.808 concat0 ]] 00:13:40.808 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:13:40.809 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b0848f4f-942d-4e56-9637-abca00ff00a6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b0848f4f-942d-4e56-9637-abca00ff00a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "6ff215ca-2865-5148-9996-aa1a950ad325"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6ff215ca-2865-5148-9996-aa1a950ad325",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "39562293-9a35-5f2e-8af2-3ad7e5d7e991"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "39562293-9a35-5f2e-8af2-3ad7e5d7e991",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "a948d93a-5f68-52a4-b793-f663ab5d553e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a948d93a-5f68-52a4-b793-f663ab5d553e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "64e1a375-f827-5617-88b2-953f7a85d996"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "64e1a375-f827-5617-88b2-953f7a85d996",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "4ea92a29-d4a3-5226-9848-6b257cbde9a2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4ea92a29-d4a3-5226-9848-6b257cbde9a2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d767886d-742f-5edf-9020-f7f650447b76"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d767886d-742f-5edf-9020-f7f650447b76",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "02f5ba5c-be70-59fb-857d-32956af0ad1d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "02f5ba5c-be70-59fb-857d-32956af0ad1d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "27166f81-0fe0-550e-b4d4-bea756004117"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "27166f81-0fe0-550e-b4d4-bea756004117",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "00b0614a-d996-5306-8f6f-4ebdb78fd425"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "00b0614a-d996-5306-8f6f-4ebdb78fd425",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "143ad6c7-ccae-532f-a559-1ff901ff1867"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "143ad6c7-ccae-532f-a559-1ff901ff1867",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "9e0a8ca0-3f11-5fab-a64c-cb3ad7471f75"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9e0a8ca0-3f11-5fab-a64c-cb3ad7471f75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "bbd65f9d-6d33-40cf-81d3-be16167c84f8"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "bbd65f9d-6d33-40cf-81d3-be16167c84f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "bbd65f9d-6d33-40cf-81d3-be16167c84f8",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5b271d0c-f315-4d8f-b027-00d5d9119f9b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5c7a05f5-7c77-4f75-bc0d-e0e6f13626e9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "24da28da-52b3-4e64-8ae9-69e278f3456a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "24da28da-52b3-4e64-8ae9-69e278f3456a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "24da28da-52b3-4e64-8ae9-69e278f3456a",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "bd9b6612-e683-44f0-ace4-5e004d94cd3f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "64665474-c564-4970-8673-0f86db021de3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "57c99b0e-b279-4cfa-9474-e90ce1b053d7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "57c99b0e-b279-4cfa-9474-e90ce1b053d7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "57c99b0e-b279-4cfa-9474-e90ce1b053d7",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "149a1b04-526c-4703-bc56-2ede48ae12c5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "26f2816b-98ea-45a0-9e8e-a2285489a612",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "211a8f22-8fdb-4abe-b4e6-ad49a22a3b82"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "211a8f22-8fdb-4abe-b4e6-ad49a22a3b82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:13:40.809 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.809 17:07:35 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:13:40.809 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:13:40.809 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.809 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:40.810 17:07:35 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:13:40.810 ************************************ 00:13:40.810 START TEST bdev_fio_trim 00:13:40.810 ************************************ 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:13:40.810 17:07:35 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:13:40.810 17:07:35 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:13:40.810 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:13:40.810 fio-3.35 00:13:40.810 Starting 14 threads 00:13:53.027 00:13:53.027 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=4095056: Tue Jul 23 17:07:46 2024 00:13:53.027 write: IOPS=119k, BW=463MiB/s (485MB/s)(4630MiB/10001msec); 0 zone resets 00:13:53.027 slat (usec): min=3, max=4074, avg=41.95, stdev=11.68 
00:13:53.027 clat (usec): min=32, max=1647, avg=293.08, stdev=99.58 00:13:53.027 lat (usec): min=52, max=4516, avg=335.04, stdev=103.72 00:13:53.027 clat percentiles (usec): 00:13:53.027 | 50.000th=[ 285], 99.000th=[ 519], 99.900th=[ 619], 99.990th=[ 938], 00:13:53.027 | 99.999th=[ 1106] 00:13:53.027 bw ( KiB/s): min=421056, max=540865, per=100.00%, avg=474242.11, stdev=2151.37, samples=266 00:13:53.027 iops : min=105264, max=135216, avg=118559.47, stdev=537.85, samples=266 00:13:53.027 trim: IOPS=119k, BW=463MiB/s (485MB/s)(4630MiB/10001msec); 0 zone resets 00:13:53.027 slat (usec): min=4, max=716, avg=28.73, stdev= 7.54 00:13:53.027 clat (usec): min=22, max=4516, avg=335.22, stdev=103.75 00:13:53.027 lat (usec): min=42, max=4538, avg=363.95, stdev=106.66 00:13:53.027 clat percentiles (usec): 00:13:53.027 | 50.000th=[ 326], 99.000th=[ 578], 99.900th=[ 676], 99.990th=[ 1045], 00:13:53.027 | 99.999th=[ 1221] 00:13:53.027 bw ( KiB/s): min=421056, max=540865, per=100.00%, avg=474242.11, stdev=2151.37, samples=266 00:13:53.027 iops : min=105264, max=135216, avg=118559.47, stdev=537.85, samples=266 00:13:53.027 lat (usec) : 50=0.02%, 100=0.48%, 250=30.33%, 500=65.57%, 750=3.57% 00:13:53.027 lat (usec) : 1000=0.02% 00:13:53.027 lat (msec) : 2=0.01%, 10=0.01% 00:13:53.027 cpu : usr=99.57%, sys=0.01%, ctx=681, majf=0, minf=1117 00:13:53.027 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:13:53.027 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:53.027 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:53.027 issued rwts: total=0,1185383,1185386,0 short=0,0,0,0 dropped=0,0,0,0 00:13:53.027 latency : target=0, window=0, percentile=100.00%, depth=8 00:13:53.027 00:13:53.027 Run status group 0 (all jobs): 00:13:53.027 WRITE: bw=463MiB/s (485MB/s), 463MiB/s-463MiB/s (485MB/s-485MB/s), io=4630MiB (4855MB), run=10001-10001msec 00:13:53.027 TRIM: bw=463MiB/s (485MB/s), 463MiB/s-463MiB/s 
(485MB/s-485MB/s), io=4630MiB (4855MB), run=10001-10001msec 00:13:53.027 00:13:53.027 real 0m11.583s 00:13:53.027 user 2m25.381s 00:13:53.027 sys 0m0.952s 00:13:53.027 17:07:47 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:53.027 17:07:47 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:13:53.027 ************************************ 00:13:53.027 END TEST bdev_fio_trim 00:13:53.027 ************************************ 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:13:53.027 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:13:53.027 00:13:53.027 real 0m23.980s 00:13:53.027 user 5m11.261s 00:13:53.027 sys 0m2.763s 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:53.027 17:07:47 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:13:53.027 ************************************ 00:13:53.027 END TEST bdev_fio 00:13:53.027 ************************************ 00:13:53.027 17:07:47 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:13:53.027 17:07:47 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:13:53.028 17:07:47 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:13:53.028 17:07:47 
blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:13:53.028 17:07:47 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:53.028 17:07:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:53.028 ************************************ 00:13:53.028 START TEST bdev_verify 00:13:53.028 ************************************ 00:13:53.028 17:07:47 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:13:53.028 [2024-07-23 17:07:47.508834] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:13:53.028 [2024-07-23 17:07:47.508901] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4096494 ] 00:13:53.028 [2024-07-23 17:07:47.641505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:13:53.028 [2024-07-23 17:07:47.696871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:53.028 [2024-07-23 17:07:47.696876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:53.028 [2024-07-23 17:07:47.846340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:53.028 [2024-07-23 17:07:47.846385] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:13:53.028 [2024-07-23 17:07:47.846400] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:13:53.028 [2024-07-23 17:07:47.854350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:53.028 [2024-07-23 17:07:47.854378] bdev.c:8190:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:53.028 [2024-07-23 17:07:47.862360] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:53.028 [2024-07-23 17:07:47.862384] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:53.028 [2024-07-23 17:07:47.939636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:53.028 [2024-07-23 17:07:47.939688] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:53.028 [2024-07-23 17:07:47.939705] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ff1f0 00:13:53.028 [2024-07-23 17:07:47.939717] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:53.028 [2024-07-23 17:07:47.941343] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:53.028 [2024-07-23 17:07:47.941373] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:13:53.028 Running I/O for 5 seconds... 
00:13:58.300 00:13:58.300 Latency(us) 00:13:58.300 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:58.300 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.300 Verification LBA range: start 0x0 length 0x1000 00:13:58.300 Malloc0 : 5.08 1158.03 4.52 0.00 0.00 110318.68 537.82 355604.03 00:13:58.300 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.300 Verification LBA range: start 0x1000 length 0x1000 00:13:58.300 Malloc0 : 5.15 943.63 3.69 0.00 0.00 135342.67 698.10 408488.74 00:13:58.300 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.300 Verification LBA range: start 0x0 length 0x800 00:13:58.300 Malloc1p0 : 5.09 603.90 2.36 0.00 0.00 210972.81 2592.95 179625.63 00:13:58.300 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.300 Verification LBA range: start 0x800 length 0x800 00:13:58.300 Malloc1p0 : 5.16 496.39 1.94 0.00 0.00 256515.17 3305.29 221568.67 00:13:58.300 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.300 Verification LBA range: start 0x0 length 0x800 00:13:58.300 Malloc1p1 : 5.09 603.65 2.36 0.00 0.00 210610.84 2521.71 178713.82 00:13:58.300 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.300 Verification LBA range: start 0x800 length 0x800 00:13:58.300 Malloc1p1 : 5.16 496.13 1.94 0.00 0.00 256010.77 3248.31 221568.67 00:13:58.301 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p0 : 5.09 603.40 2.36 0.00 0.00 210252.58 2535.96 178713.82 00:13:58.301 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p0 : 5.16 495.87 1.94 0.00 0.00 255505.01 3789.69 220656.86 00:13:58.301 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p1 : 5.20 615.17 2.40 0.00 0.00 205834.11 3291.05 175978.41 00:13:58.301 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p1 : 5.17 495.61 1.94 0.00 0.00 254865.18 4188.61 217921.45 00:13:58.301 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p2 : 5.20 614.92 2.40 0.00 0.00 205404.66 3675.71 171419.38 00:13:58.301 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p2 : 5.17 495.36 1.93 0.00 0.00 254155.35 3462.01 214274.23 00:13:58.301 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p3 : 5.21 614.67 2.40 0.00 0.00 204941.69 2806.65 169595.77 00:13:58.301 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p3 : 5.17 495.09 1.93 0.00 0.00 253601.81 3234.06 214274.23 00:13:58.301 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p4 : 5.21 614.42 2.40 0.00 0.00 204581.29 2507.46 169595.77 00:13:58.301 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p4 : 5.17 494.81 1.93 0.00 0.00 253118.35 3632.97 214274.23 00:13:58.301 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p5 : 5.21 614.17 2.40 0.00 0.00 
204234.20 2493.22 171419.38 00:13:58.301 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p5 : 5.25 511.80 2.00 0.00 0.00 244117.76 4217.10 212450.62 00:13:58.301 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p6 : 5.21 613.92 2.40 0.00 0.00 203888.67 3034.60 171419.38 00:13:58.301 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p6 : 5.25 511.56 2.00 0.00 0.00 243440.41 3063.10 209715.20 00:13:58.301 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x200 00:13:58.301 Malloc2p7 : 5.21 613.66 2.40 0.00 0.00 203491.27 3818.18 167772.16 00:13:58.301 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x200 length 0x200 00:13:58.301 Malloc2p7 : 5.26 511.32 2.00 0.00 0.00 242866.12 3903.67 205156.17 00:13:58.301 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x1000 00:13:58.301 TestPT : 5.22 598.92 2.34 0.00 0.00 207269.32 10599.74 164124.94 00:13:58.301 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x1000 length 0x1000 00:13:58.301 TestPT : 5.24 488.66 1.91 0.00 0.00 253246.33 37156.06 206067.98 00:13:58.301 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x2000 00:13:58.301 raid0 : 5.22 613.24 2.40 0.00 0.00 202465.91 3462.01 156830.50 00:13:58.301 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x2000 length 
0x2000 00:13:58.301 raid0 : 5.26 511.07 2.00 0.00 0.00 241507.80 3077.34 189655.49 00:13:58.301 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x2000 00:13:58.301 concat0 : 5.22 612.99 2.39 0.00 0.00 202023.14 2308.01 156830.50 00:13:58.301 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x2000 length 0x2000 00:13:58.301 concat0 : 5.26 510.82 2.00 0.00 0.00 240977.66 3447.76 187831.87 00:13:58.301 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x1000 00:13:58.301 raid1 : 5.22 612.73 2.39 0.00 0.00 201641.62 3205.57 167772.16 00:13:58.301 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x1000 length 0x1000 00:13:58.301 raid1 : 5.26 510.56 1.99 0.00 0.00 240267.90 4160.11 194214.51 00:13:58.301 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x0 length 0x4e2 00:13:58.301 AIO0 : 5.22 612.54 2.39 0.00 0.00 201238.04 1310.72 175978.41 00:13:58.301 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:13:58.301 Verification LBA range: start 0x4e2 length 0x4e2 00:13:58.301 AIO0 : 5.27 510.37 1.99 0.00 0.00 239587.31 1688.26 197861.73 00:13:58.301 =================================================================================================================== 00:13:58.301 Total : 18799.35 73.43 0.00 0.00 213432.74 537.82 408488.74 00:13:58.560 00:13:58.560 real 0m6.420s 00:13:58.560 user 0m11.905s 00:13:58.560 sys 0m0.427s 00:13:58.560 17:07:53 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:58.560 17:07:53 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:13:58.560 ************************************ 00:13:58.560 END TEST 
bdev_verify 00:13:58.560 ************************************ 00:13:58.560 17:07:53 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:13:58.560 17:07:53 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:13:58.560 17:07:53 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:13:58.560 17:07:53 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:58.560 17:07:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:13:58.560 ************************************ 00:13:58.560 START TEST bdev_verify_big_io 00:13:58.560 ************************************ 00:13:58.560 17:07:53 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:13:58.819 [2024-07-23 17:07:54.060661] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:13:58.819 [2024-07-23 17:07:54.060792] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4097382 ] 00:13:59.078 [2024-07-23 17:07:54.267490] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:13:59.078 [2024-07-23 17:07:54.327487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:13:59.079 [2024-07-23 17:07:54.327491] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:59.079 [2024-07-23 17:07:54.481975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:59.079 [2024-07-23 17:07:54.482033] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:13:59.079 [2024-07-23 17:07:54.482047] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:13:59.079 [2024-07-23 17:07:54.489970] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:59.079 [2024-07-23 17:07:54.489999] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:13:59.079 [2024-07-23 17:07:54.497978] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:59.079 [2024-07-23 17:07:54.498002] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:13:59.338 [2024-07-23 17:07:54.575053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:13:59.338 [2024-07-23 17:07:54.575104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:59.338 [2024-07-23 17:07:54.575120] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ec1f0 00:13:59.338 [2024-07-23 17:07:54.575133] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:13:59.338 [2024-07-23 17:07:54.576803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:59.338 [2024-07-23 17:07:54.576831] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:13:59.338 [2024-07-23 17:07:54.749687] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:13:59.338 [2024-07-23 17:07:54.750968] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:13:59.338 [2024-07-23 17:07:54.752810] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:13:59.338 [2024-07-23 17:07:54.754075] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:13:59.338 [2024-07-23 17:07:54.755890] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:13:59.338 [2024-07-23 17:07:54.757149] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:13:59.338 [2024-07-23 17:07:54.758981] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.760653] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.761632] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.763135] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.764102] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.765567] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.766551] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.768049] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.769035] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.770496] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:13:59.597 [2024-07-23 17:07:54.795710] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:13:59.597 [2024-07-23 17:07:54.797785] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:13:59.597 Running I/O for 5 seconds... 
00:14:07.710 00:14:07.710 Latency(us) 00:14:07.710 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:07.710 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x0 length 0x100 00:14:07.710 Malloc0 : 6.03 169.95 10.62 0.00 0.00 737808.28 872.63 2042443.69 00:14:07.710 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x100 length 0x100 00:14:07.710 Malloc0 : 6.77 151.22 9.45 0.00 0.00 725403.41 1111.26 1546421.65 00:14:07.710 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x0 length 0x80 00:14:07.710 Malloc1p0 : 6.80 35.29 2.21 0.00 0.00 3300778.56 1481.68 5602131.26 00:14:07.710 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x80 length 0x80 00:14:07.710 Malloc1p0 : 7.05 34.04 2.13 0.00 0.00 3081013.19 1852.10 5485420.19 00:14:07.710 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x0 length 0x80 00:14:07.710 Malloc1p1 : 6.80 35.29 2.21 0.00 0.00 3189260.57 1552.92 5397886.89 00:14:07.710 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x80 length 0x80 00:14:07.710 Malloc1p1 : 7.06 34.00 2.12 0.00 0.00 2973118.20 1837.86 5251998.05 00:14:07.710 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x0 length 0x20 00:14:07.710 Malloc2p0 : 6.30 22.85 1.43 0.00 0.00 1220871.26 658.92 2188332.52 00:14:07.710 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x20 length 0x20 00:14:07.710 Malloc2p0 : 6.97 20.67 1.29 0.00 0.00 1179460.63 747.97 1779843.78 00:14:07.710 Job: Malloc2p1 (Core Mask 
0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x0 length 0x20 00:14:07.710 Malloc2p1 : 6.30 22.85 1.43 0.00 0.00 1209977.53 637.55 2159154.75 00:14:07.710 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x20 length 0x20 00:14:07.710 Malloc2p1 : 7.03 22.75 1.42 0.00 0.00 1075449.49 758.65 1750666.02 00:14:07.710 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x0 length 0x20 00:14:07.710 Malloc2p2 : 6.42 24.93 1.56 0.00 0.00 1117174.97 626.87 2129976.99 00:14:07.710 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.710 Verification LBA range: start 0x20 length 0x20 00:14:07.710 Malloc2p2 : 7.03 22.75 1.42 0.00 0.00 1064054.68 787.14 1714193.81 00:14:07.710 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x20 00:14:07.711 Malloc2p3 : 6.42 24.92 1.56 0.00 0.00 1107377.70 641.11 2100799.22 00:14:07.711 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x20 length 0x20 00:14:07.711 Malloc2p3 : 7.04 22.74 1.42 0.00 0.00 1052655.74 794.27 1692310.48 00:14:07.711 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x20 00:14:07.711 Malloc2p4 : 6.42 24.92 1.56 0.00 0.00 1096979.07 633.99 2071621.45 00:14:07.711 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x20 length 0x20 00:14:07.711 Malloc2p4 : 7.04 22.73 1.42 0.00 0.00 1041830.63 780.02 1655838.27 00:14:07.711 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x20 00:14:07.711 Malloc2p5 : 6.42 24.91 1.56 0.00 0.00 1085767.49 
630.43 2042443.69 00:14:07.711 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x20 length 0x20 00:14:07.711 Malloc2p5 : 7.04 22.73 1.42 0.00 0.00 1030899.92 780.02 1626660.51 00:14:07.711 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x20 00:14:07.711 Malloc2p6 : 6.42 24.91 1.56 0.00 0.00 1074587.50 651.80 2013265.92 00:14:07.711 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x20 length 0x20 00:14:07.711 Malloc2p6 : 7.04 22.72 1.42 0.00 0.00 1019179.29 790.71 1597482.74 00:14:07.711 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x20 00:14:07.711 Malloc2p7 : 6.42 24.90 1.56 0.00 0.00 1064014.90 633.99 1984088.15 00:14:07.711 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x20 length 0x20 00:14:07.711 Malloc2p7 : 7.05 24.95 1.56 0.00 0.00 924367.97 776.46 1575599.42 00:14:07.711 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x100 00:14:07.711 TestPT : 6.86 35.29 2.21 0.00 0.00 2879552.46 108960.72 3822287.47 00:14:07.711 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x100 length 0x100 00:14:07.711 TestPT : 7.06 38.54 2.41 0.00 0.00 2333625.29 10656.72 4172420.67 00:14:07.711 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x200 00:14:07.711 raid0 : 6.77 40.20 2.51 0.00 0.00 2450145.11 1595.66 4726798.25 00:14:07.711 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x200 length 0x200 00:14:07.711 
raid0 : 6.76 63.28 3.96 0.00 0.00 1775701.47 3105.84 3370032.08 00:14:07.711 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x200 00:14:07.711 concat0 : 6.80 51.73 3.23 0.00 0.00 1873627.66 1588.54 4551731.65 00:14:07.711 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x200 length 0x200 00:14:07.711 concat0 : 7.05 31.79 1.99 0.00 0.00 3716772.00 1994.57 6302397.66 00:14:07.711 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x100 00:14:07.711 raid1 : 6.93 50.79 3.17 0.00 0.00 1834494.77 2065.81 4347487.28 00:14:07.711 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x100 length 0x100 00:14:07.711 raid1 : 7.08 31.66 1.98 0.00 0.00 3615218.17 2564.45 6068975.53 00:14:07.711 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x0 length 0x4e 00:14:07.711 AIO0 : 7.02 65.26 4.08 0.00 0.00 852587.41 854.82 2801065.63 00:14:07.711 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:14:07.711 Verification LBA range: start 0x4e length 0x4e 00:14:07.711 AIO0 : 7.05 24.68 1.54 0.00 0.00 2709713.30 1025.78 4347487.28 00:14:07.711 =================================================================================================================== 00:14:07.711 Total : 1270.26 79.39 0.00 0.00 1618459.80 626.87 6302397.66 00:14:07.711 00:14:07.711 real 0m8.425s 00:14:07.711 user 0m15.699s 00:14:07.711 sys 0m0.523s 00:14:07.711 17:08:02 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:07.711 17:08:02 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:14:07.711 ************************************ 00:14:07.711 END TEST 
bdev_verify_big_io 00:14:07.711 ************************************ 00:14:07.711 17:08:02 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:14:07.711 17:08:02 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:14:07.711 17:08:02 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:14:07.711 17:08:02 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:07.711 17:08:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:07.711 ************************************ 00:14:07.711 START TEST bdev_write_zeroes 00:14:07.711 ************************************ 00:14:07.711 17:08:02 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:14:07.711 [2024-07-23 17:08:02.525375] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:14:07.711 [2024-07-23 17:08:02.525436] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4098458 ] 00:14:07.711 [2024-07-23 17:08:02.657647] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:07.711 [2024-07-23 17:08:02.711589] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:07.711 [2024-07-23 17:08:02.864969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:14:07.711 [2024-07-23 17:08:02.865020] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:14:07.711 [2024-07-23 17:08:02.865035] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:14:07.711 [2024-07-23 17:08:02.872972] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:14:07.711 [2024-07-23 17:08:02.873000] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:14:07.711 [2024-07-23 17:08:02.880983] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:14:07.711 [2024-07-23 17:08:02.881008] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:14:07.711 [2024-07-23 17:08:02.958251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:14:07.711 [2024-07-23 17:08:02.958303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:07.711 [2024-07-23 17:08:02.958325] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2826d30 00:14:07.711 [2024-07-23 17:08:02.958338] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:07.711 [2024-07-23 17:08:02.959949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:14:07.711 [2024-07-23 17:08:02.959978] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:14:07.971 Running I/O for 1 seconds... 00:14:08.907 00:14:08.907 Latency(us) 00:14:08.907 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:08.907 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc0 : 1.04 4919.42 19.22 0.00 0.00 26007.71 644.67 43082.80 00:14:08.907 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc1p0 : 1.04 4912.34 19.19 0.00 0.00 25997.86 908.24 42398.94 00:14:08.907 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc1p1 : 1.04 4905.31 19.16 0.00 0.00 25982.86 894.00 41487.14 00:14:08.907 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p0 : 1.05 4898.24 19.13 0.00 0.00 25964.75 901.12 40575.33 00:14:08.907 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p1 : 1.05 4891.22 19.11 0.00 0.00 25947.24 897.56 39663.53 00:14:08.907 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p2 : 1.05 4884.28 19.08 0.00 0.00 25928.68 901.12 38751.72 00:14:08.907 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p3 : 1.05 4877.26 19.05 0.00 0.00 25905.19 897.56 37839.92 00:14:08.907 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p4 : 1.05 4870.34 19.02 0.00 0.00 25882.15 901.12 36928.11 00:14:08.907 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p5 : 1.05 4863.40 19.00 0.00 0.00 25861.87 901.12 36016.31 00:14:08.907 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p6 : 1.05 4856.46 
18.97 0.00 0.00 25842.33 890.43 35104.50 00:14:08.907 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 Malloc2p7 : 1.06 4849.57 18.94 0.00 0.00 25827.36 901.12 34420.65 00:14:08.907 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 TestPT : 1.06 4842.74 18.92 0.00 0.00 25805.79 918.93 33508.84 00:14:08.907 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 raid0 : 1.06 4834.81 18.89 0.00 0.00 25774.19 1631.28 31685.23 00:14:08.907 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.907 concat0 : 1.06 4827.00 18.86 0.00 0.00 25716.47 1617.03 30089.57 00:14:08.908 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.908 raid1 : 1.06 4817.35 18.82 0.00 0.00 25656.29 2592.95 27468.13 00:14:08.908 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:14:08.908 AIO0 : 1.07 4924.25 19.24 0.00 0.00 24975.12 487.96 26442.35 00:14:08.908 =================================================================================================================== 00:14:08.908 Total : 77974.00 304.59 0.00 0.00 25815.93 487.96 43082.80 00:14:09.475 00:14:09.475 real 0m2.181s 00:14:09.475 user 0m1.756s 00:14:09.475 sys 0m0.354s 00:14:09.475 17:08:04 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:09.475 17:08:04 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:14:09.475 ************************************ 00:14:09.475 END TEST bdev_write_zeroes 00:14:09.475 ************************************ 00:14:09.475 17:08:04 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:14:09.475 17:08:04 blockdev_general -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:14:09.476 17:08:04 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:14:09.476 17:08:04 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:09.476 17:08:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:09.476 ************************************ 00:14:09.476 START TEST bdev_json_nonenclosed 00:14:09.476 ************************************ 00:14:09.476 17:08:04 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:14:09.476 [2024-07-23 17:08:04.791835] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:14:09.476 [2024-07-23 17:08:04.791884] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4098824 ] 00:14:09.735 [2024-07-23 17:08:04.907029] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:09.735 [2024-07-23 17:08:04.965252] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:09.735 [2024-07-23 17:08:04.965325] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:14:09.735 [2024-07-23 17:08:04.965343] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:14:09.735 [2024-07-23 17:08:04.965355] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:14:09.735 00:14:09.735 real 0m0.319s 00:14:09.735 user 0m0.174s 00:14:09.735 sys 0m0.143s 00:14:09.735 17:08:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:14:09.735 17:08:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:09.735 17:08:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:14:09.735 ************************************ 00:14:09.735 END TEST bdev_json_nonenclosed 00:14:09.735 ************************************ 00:14:09.735 17:08:05 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:14:09.735 17:08:05 blockdev_general -- bdev/blockdev.sh@781 -- # true 00:14:09.735 17:08:05 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:14:09.735 17:08:05 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:14:09.735 17:08:05 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:09.735 17:08:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:09.735 ************************************ 00:14:09.735 START TEST bdev_json_nonarray 00:14:09.735 ************************************ 00:14:09.735 17:08:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:14:09.994 [2024-07-23 17:08:05.202212] Starting 
SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:14:09.994 [2024-07-23 17:08:05.202263] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4098846 ] 00:14:09.994 [2024-07-23 17:08:05.317169] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:09.994 [2024-07-23 17:08:05.375098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:09.994 [2024-07-23 17:08:05.375183] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:14:09.994 [2024-07-23 17:08:05.375202] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:14:09.994 [2024-07-23 17:08:05.375215] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:14:10.252 00:14:10.252 real 0m0.320s 00:14:10.252 user 0m0.173s 00:14:10.252 sys 0m0.144s 00:14:10.252 17:08:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:14:10.252 17:08:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:10.252 17:08:05 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:14:10.252 ************************************ 00:14:10.252 END TEST bdev_json_nonarray 00:14:10.252 ************************************ 00:14:10.252 17:08:05 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:14:10.252 17:08:05 blockdev_general -- bdev/blockdev.sh@784 -- # true 00:14:10.252 17:08:05 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:14:10.252 17:08:05 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:14:10.252 17:08:05 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:14:10.252 17:08:05 
blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:10.252 17:08:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:10.252 ************************************ 00:14:10.252 START TEST bdev_qos 00:14:10.252 ************************************ 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=4098904 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 4098904' 00:14:10.252 Process qos testing pid: 4098904 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 4098904 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 4098904 ']' 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:10.252 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:10.252 17:08:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.252 [2024-07-23 17:08:05.622044] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:14:10.252 [2024-07-23 17:08:05.622108] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4098904 ] 00:14:10.511 [2024-07-23 17:08:05.762577] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:10.511 [2024-07-23 17:08:05.826529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.770 Malloc_0 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:10.770 17:08:06 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:10.770 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.770 [ 00:14:10.770 { 00:14:10.770 "name": "Malloc_0", 00:14:10.770 "aliases": [ 00:14:10.770 "883804be-402c-47ed-9489-381d95bd9ada" 00:14:10.770 ], 00:14:10.770 "product_name": "Malloc disk", 00:14:10.770 "block_size": 512, 00:14:10.770 "num_blocks": 262144, 00:14:10.770 "uuid": "883804be-402c-47ed-9489-381d95bd9ada", 00:14:10.770 "assigned_rate_limits": { 00:14:10.770 "rw_ios_per_sec": 0, 00:14:10.770 "rw_mbytes_per_sec": 0, 00:14:10.770 "r_mbytes_per_sec": 0, 00:14:10.770 "w_mbytes_per_sec": 0 00:14:10.770 }, 00:14:10.770 "claimed": false, 00:14:10.770 "zoned": false, 00:14:10.770 "supported_io_types": { 00:14:10.770 "read": true, 00:14:10.770 "write": true, 00:14:10.770 "unmap": true, 00:14:10.770 "flush": true, 00:14:10.770 "reset": true, 00:14:10.770 "nvme_admin": false, 00:14:10.770 "nvme_io": false, 00:14:10.770 "nvme_io_md": false, 00:14:10.770 "write_zeroes": true, 00:14:10.770 "zcopy": true, 00:14:10.770 "get_zone_info": false, 00:14:10.770 "zone_management": false, 00:14:10.770 "zone_append": false, 00:14:10.770 "compare": false, 00:14:10.770 "compare_and_write": false, 00:14:10.770 "abort": true, 00:14:10.770 "seek_hole": false, 00:14:10.770 
"seek_data": false, 00:14:10.770 "copy": true, 00:14:10.770 "nvme_iov_md": false 00:14:10.770 }, 00:14:10.770 "memory_domains": [ 00:14:10.770 { 00:14:10.770 "dma_device_id": "system", 00:14:10.770 "dma_device_type": 1 00:14:10.770 }, 00:14:10.770 { 00:14:10.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.770 "dma_device_type": 2 00:14:10.770 } 00:14:10.770 ], 00:14:10.770 "driver_specific": {} 00:14:10.770 } 00:14:10.770 ] 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.771 Null_1 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.771 17:08:06 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:10.771 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:10.771 [ 00:14:10.771 { 00:14:10.771 "name": "Null_1", 00:14:10.771 "aliases": [ 00:14:10.771 "89cbed80-83b6-4623-90e2-66a39251d9f1" 00:14:10.771 ], 00:14:10.771 "product_name": "Null disk", 00:14:10.771 "block_size": 512, 00:14:10.771 "num_blocks": 262144, 00:14:10.771 "uuid": "89cbed80-83b6-4623-90e2-66a39251d9f1", 00:14:10.771 "assigned_rate_limits": { 00:14:10.771 "rw_ios_per_sec": 0, 00:14:11.030 "rw_mbytes_per_sec": 0, 00:14:11.030 "r_mbytes_per_sec": 0, 00:14:11.030 "w_mbytes_per_sec": 0 00:14:11.030 }, 00:14:11.030 "claimed": false, 00:14:11.030 "zoned": false, 00:14:11.030 "supported_io_types": { 00:14:11.030 "read": true, 00:14:11.030 "write": true, 00:14:11.030 "unmap": false, 00:14:11.030 "flush": false, 00:14:11.030 "reset": true, 00:14:11.030 "nvme_admin": false, 00:14:11.030 "nvme_io": false, 00:14:11.030 "nvme_io_md": false, 00:14:11.030 "write_zeroes": true, 00:14:11.030 "zcopy": false, 00:14:11.030 "get_zone_info": false, 00:14:11.030 "zone_management": false, 00:14:11.030 "zone_append": false, 00:14:11.030 "compare": false, 00:14:11.030 "compare_and_write": false, 00:14:11.030 "abort": true, 00:14:11.030 "seek_hole": false, 00:14:11.030 "seek_data": false, 00:14:11.030 "copy": false, 00:14:11.030 "nvme_iov_md": false 00:14:11.030 }, 00:14:11.030 "driver_specific": {} 00:14:11.030 } 00:14:11.030 ] 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@456 -- # qos_function_test 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:14:11.030 17:08:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:14:11.030 Running I/O for 60 seconds... 
00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 48747.68 194990.72 0.00 0.00 196608.00 0.00 0.00 ' 00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=48747.68 00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 48747 00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=48747 00:14:16.381 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=12000 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 12000 -gt 1000 ']' 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:16.382 17:08:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:16.382 ************************************ 00:14:16.382 START TEST bdev_qos_iops 00:14:16.382 ************************************ 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 12000 IOPS Malloc_0 00:14:16.382 17:08:11 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=12000 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:14:16.382 17:08:11 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 11998.24 47992.96 0.00 0.00 49056.00 0.00 0.00 ' 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=11998.24 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 11998 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=11998 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=10800 00:14:21.659 17:08:16 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=13200 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 11998 -lt 10800 ']' 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 11998 -gt 13200 ']' 00:14:21.659 00:14:21.659 real 0m5.275s 00:14:21.659 user 0m0.121s 00:14:21.659 sys 0m0.047s 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:21.659 17:08:16 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:14:21.659 ************************************ 00:14:21.659 END TEST bdev_qos_iops 00:14:21.659 ************************************ 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:14:21.659 17:08:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 15515.56 62062.23 0.00 0.00 63488.00 0.00 0.00 ' 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:14:26.939 17:08:22 
blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=63488.00 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 63488 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=63488 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=6 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 6 -lt 2 ']' 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:26.939 17:08:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:26.939 ************************************ 00:14:26.939 START TEST bdev_qos_bw 00:14:26.939 ************************************ 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 6 BANDWIDTH Null_1 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=6 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:14:26.939 
17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:14:26.939 17:08:22 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 1536.16 6144.64 0.00 0.00 6320.00 0.00 0.00 ' 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=6320.00 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 6320 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=6320 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=6144 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=5529 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=6758 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@399 -- # '[' 6320 -lt 5529 ']' 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 6320 -gt 6758 ']' 00:14:32.217 00:14:32.217 real 0m5.320s 00:14:32.217 user 0m0.119s 00:14:32.217 sys 0m0.047s 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:14:32.217 ************************************ 00:14:32.217 END TEST bdev_qos_bw 00:14:32.217 ************************************ 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:32.217 17:08:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:32.217 ************************************ 00:14:32.217 START TEST bdev_qos_ro_bw 00:14:32.217 ************************************ 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:14:32.217 17:08:27 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Malloc_0 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:14:32.217 17:08:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 512.28 2049.12 0.00 0.00 2056.00 0.00 0.00 ' 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2056.00 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2056 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2056 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 
00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # upper_limit=2252 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2056 -lt 1843 ']' 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2056 -gt 2252 ']' 00:14:37.498 00:14:37.498 real 0m5.190s 00:14:37.498 user 0m0.117s 00:14:37.498 sys 0m0.051s 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:37.498 17:08:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:14:37.498 ************************************ 00:14:37.498 END TEST bdev_qos_ro_bw 00:14:37.498 ************************************ 00:14:37.498 17:08:32 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:14:37.498 17:08:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:14:37.498 17:08:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:37.498 17:08:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:38.068 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:38.068 17:08:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:14:38.068 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:38.068 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:38.328 00:14:38.328 Latency(us) 00:14:38.328 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:38.328 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:14:38.328 Malloc_0 : 26.97 16419.08 64.14 0.00 0.00 15447.29 2550.21 503316.48 
00:14:38.328 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:14:38.328 Null_1 : 27.17 15828.16 61.83 0.00 0.00 16116.83 1004.41 198773.54 00:14:38.328 =================================================================================================================== 00:14:38.328 Total : 32247.24 125.97 0.00 0.00 15777.16 1004.41 503316.48 00:14:38.328 0 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 4098904 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 4098904 ']' 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 4098904 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4098904 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4098904' 00:14:38.328 killing process with pid 4098904 00:14:38.328 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 4098904 00:14:38.328 Received shutdown signal, test time was about 27.232195 seconds 00:14:38.328 00:14:38.328 Latency(us) 00:14:38.328 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:38.328 =================================================================================================================== 00:14:38.328 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:14:38.328 
17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 4098904 00:14:38.588 17:08:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 00:14:38.588 00:14:38.588 real 0m28.315s 00:14:38.588 user 0m29.060s 00:14:38.588 sys 0m0.941s 00:14:38.588 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:38.588 17:08:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:14:38.588 ************************************ 00:14:38.588 END TEST bdev_qos 00:14:38.589 ************************************ 00:14:38.589 17:08:33 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:14:38.589 17:08:33 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:14:38.589 17:08:33 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:14:38.589 17:08:33 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:38.589 17:08:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:38.589 ************************************ 00:14:38.589 START TEST bdev_qd_sampling 00:14:38.589 ************************************ 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=4102665 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 4102665' 00:14:38.589 Process bdev QD sampling period testing pid: 4102665 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:14:38.589 17:08:33 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 4102665 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 4102665 ']' 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:38.589 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:38.589 17:08:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:38.848 [2024-07-23 17:08:34.030006] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:14:38.849 [2024-07-23 17:08:34.030080] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4102665 ] 00:14:38.849 [2024-07-23 17:08:34.164056] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:14:38.849 [2024-07-23 17:08:34.219478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:14:38.849 [2024-07-23 17:08:34.219481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:39.787 17:08:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:39.787 17:08:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:14:39.787 17:08:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:14:39.787 17:08:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:39.787 17:08:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:39.787 Malloc_QD 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.787 17:08:35 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:39.787 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:39.787 [ 00:14:39.787 { 00:14:39.787 "name": "Malloc_QD", 00:14:39.787 "aliases": [ 00:14:39.787 "24ad5250-c643-4663-b5cc-087d236ca31f" 00:14:39.787 ], 00:14:39.787 "product_name": "Malloc disk", 00:14:39.787 "block_size": 512, 00:14:39.787 "num_blocks": 262144, 00:14:39.787 "uuid": "24ad5250-c643-4663-b5cc-087d236ca31f", 00:14:39.787 "assigned_rate_limits": { 00:14:39.787 "rw_ios_per_sec": 0, 00:14:39.787 "rw_mbytes_per_sec": 0, 00:14:39.787 "r_mbytes_per_sec": 0, 00:14:39.787 "w_mbytes_per_sec": 0 00:14:39.787 }, 00:14:39.787 "claimed": false, 00:14:39.787 "zoned": false, 00:14:39.787 "supported_io_types": { 00:14:39.787 "read": true, 00:14:39.787 "write": true, 00:14:39.787 "unmap": true, 00:14:39.787 "flush": true, 00:14:39.787 "reset": true, 00:14:39.787 "nvme_admin": false, 00:14:39.787 "nvme_io": false, 00:14:39.787 "nvme_io_md": false, 00:14:39.787 "write_zeroes": true, 00:14:39.787 "zcopy": true, 00:14:39.787 "get_zone_info": false, 00:14:39.787 "zone_management": false, 00:14:39.787 "zone_append": false, 00:14:39.787 "compare": false, 00:14:39.787 "compare_and_write": false, 00:14:39.787 "abort": true, 00:14:39.788 "seek_hole": false, 00:14:39.788 "seek_data": false, 00:14:39.788 "copy": true, 
00:14:39.788 "nvme_iov_md": false 00:14:39.788 }, 00:14:39.788 "memory_domains": [ 00:14:39.788 { 00:14:39.788 "dma_device_id": "system", 00:14:39.788 "dma_device_type": 1 00:14:39.788 }, 00:14:39.788 { 00:14:39.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.788 "dma_device_type": 2 00:14:39.788 } 00:14:39.788 ], 00:14:39.788 "driver_specific": {} 00:14:39.788 } 00:14:39.788 ] 00:14:39.788 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:39.788 17:08:35 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:14:39.788 17:08:35 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:14:39.788 17:08:35 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:14:39.788 Running I/O for 5 seconds... 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:14:41.709 "tick_rate": 2300000000, 00:14:41.709 "ticks": 6975778089483612, 00:14:41.709 "bdevs": [ 00:14:41.709 { 00:14:41.709 "name": "Malloc_QD", 00:14:41.709 "bytes_read": 674279936, 00:14:41.709 "num_read_ops": 164612, 00:14:41.709 "bytes_written": 0, 00:14:41.709 "num_write_ops": 0, 00:14:41.709 "bytes_unmapped": 0, 00:14:41.709 "num_unmap_ops": 0, 00:14:41.709 "bytes_copied": 0, 00:14:41.709 "num_copy_ops": 0, 00:14:41.709 "read_latency_ticks": 2244363631754, 00:14:41.709 "max_read_latency_ticks": 18096436, 00:14:41.709 "min_read_latency_ticks": 285558, 00:14:41.709 "write_latency_ticks": 0, 00:14:41.709 "max_write_latency_ticks": 0, 00:14:41.709 "min_write_latency_ticks": 0, 00:14:41.709 "unmap_latency_ticks": 0, 00:14:41.709 "max_unmap_latency_ticks": 0, 00:14:41.709 "min_unmap_latency_ticks": 0, 00:14:41.709 "copy_latency_ticks": 0, 00:14:41.709 "max_copy_latency_ticks": 0, 00:14:41.709 "min_copy_latency_ticks": 0, 00:14:41.709 "io_error": {}, 00:14:41.709 "queue_depth_polling_period": 10, 00:14:41.709 "queue_depth": 768, 00:14:41.709 "io_time": 30, 00:14:41.709 "weighted_io_time": 17920 00:14:41.709 } 00:14:41.709 ] 00:14:41.709 }' 00:14:41.709 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:14:41.980 17:08:37 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:41.980 00:14:41.980 Latency(us) 00:14:41.980 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:41.980 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:14:41.980 Malloc_QD : 1.99 47283.87 184.70 0.00 0.00 5400.51 1424.70 5784.26 00:14:41.980 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:14:41.980 Malloc_QD : 2.00 38995.02 152.32 0.00 0.00 6547.43 1253.73 7921.31 00:14:41.980 =================================================================================================================== 00:14:41.980 Total : 86278.89 337.03 0.00 0.00 5919.35 1253.73 7921.31 00:14:41.980 0 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 4102665 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 4102665 ']' 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 4102665 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4102665 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4102665' 00:14:41.980 killing process with pid 4102665 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 4102665 00:14:41.980 Received shutdown signal, test time was about 2.077079 seconds 00:14:41.980 00:14:41.980 Latency(us) 00:14:41.980 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:41.980 =================================================================================================================== 00:14:41.980 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:14:41.980 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 4102665 00:14:42.240 17:08:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:14:42.240 00:14:42.240 real 0m3.491s 00:14:42.240 user 0m6.896s 00:14:42.240 sys 0m0.455s 00:14:42.240 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:42.240 17:08:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:14:42.240 ************************************ 00:14:42.240 END TEST bdev_qd_sampling 00:14:42.240 ************************************ 00:14:42.240 17:08:37 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:14:42.240 17:08:37 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:14:42.240 17:08:37 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:14:42.241 17:08:37 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:42.241 17:08:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:42.241 ************************************ 00:14:42.241 START TEST bdev_error 00:14:42.241 ************************************ 00:14:42.241 17:08:37 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=4103215 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 4103215' 00:14:42.241 Process error testing pid: 4103215 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:14:42.241 17:08:37 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 4103215 00:14:42.241 17:08:37 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 4103215 ']' 00:14:42.241 17:08:37 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:42.241 17:08:37 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:42.241 17:08:37 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:42.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:42.241 17:08:37 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:42.241 17:08:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:42.241 [2024-07-23 17:08:37.604842] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:14:42.241 [2024-07-23 17:08:37.604918] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4103215 ] 00:14:42.500 [2024-07-23 17:08:37.741695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:42.500 [2024-07-23 17:08:37.798087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:14:43.438 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.438 Dev_1 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.438 
17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.438 [ 00:14:43.438 { 00:14:43.438 "name": "Dev_1", 00:14:43.438 "aliases": [ 00:14:43.438 "35adc290-968a-4601-8be7-27f0441fa059" 00:14:43.438 ], 00:14:43.438 "product_name": "Malloc disk", 00:14:43.438 "block_size": 512, 00:14:43.438 "num_blocks": 262144, 00:14:43.438 "uuid": "35adc290-968a-4601-8be7-27f0441fa059", 00:14:43.438 "assigned_rate_limits": { 00:14:43.438 "rw_ios_per_sec": 0, 00:14:43.438 "rw_mbytes_per_sec": 0, 00:14:43.438 "r_mbytes_per_sec": 0, 00:14:43.438 "w_mbytes_per_sec": 0 00:14:43.438 }, 00:14:43.438 "claimed": false, 00:14:43.438 "zoned": false, 00:14:43.438 "supported_io_types": { 00:14:43.438 "read": true, 00:14:43.438 "write": true, 00:14:43.438 "unmap": true, 00:14:43.438 "flush": true, 00:14:43.438 "reset": true, 00:14:43.438 "nvme_admin": false, 00:14:43.438 "nvme_io": false, 00:14:43.438 "nvme_io_md": false, 00:14:43.438 "write_zeroes": true, 00:14:43.438 "zcopy": true, 00:14:43.438 "get_zone_info": false, 00:14:43.438 "zone_management": false, 00:14:43.438 "zone_append": false, 00:14:43.438 "compare": false, 00:14:43.438 "compare_and_write": false, 00:14:43.438 "abort": true, 00:14:43.438 "seek_hole": false, 00:14:43.438 "seek_data": false, 00:14:43.438 "copy": true, 00:14:43.438 "nvme_iov_md": false 00:14:43.438 }, 00:14:43.438 "memory_domains": [ 00:14:43.438 { 00:14:43.438 "dma_device_id": "system", 00:14:43.438 "dma_device_type": 1 00:14:43.438 }, 00:14:43.438 { 00:14:43.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:14:43.438 "dma_device_type": 2 00:14:43.438 } 00:14:43.438 ], 00:14:43.438 "driver_specific": {} 00:14:43.438 } 00:14:43.438 ] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:14:43.438 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.438 true 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.438 Dev_2 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:43.438 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:43.438 17:08:38 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.439 [ 00:14:43.439 { 00:14:43.439 "name": "Dev_2", 00:14:43.439 "aliases": [ 00:14:43.439 "b04c4e27-d887-4f0b-bebd-5269f546edc5" 00:14:43.439 ], 00:14:43.439 "product_name": "Malloc disk", 00:14:43.439 "block_size": 512, 00:14:43.439 "num_blocks": 262144, 00:14:43.439 "uuid": "b04c4e27-d887-4f0b-bebd-5269f546edc5", 00:14:43.439 "assigned_rate_limits": { 00:14:43.439 "rw_ios_per_sec": 0, 00:14:43.439 "rw_mbytes_per_sec": 0, 00:14:43.439 "r_mbytes_per_sec": 0, 00:14:43.439 "w_mbytes_per_sec": 0 00:14:43.439 }, 00:14:43.439 "claimed": false, 00:14:43.439 "zoned": false, 00:14:43.439 "supported_io_types": { 00:14:43.439 "read": true, 00:14:43.439 "write": true, 00:14:43.439 "unmap": true, 00:14:43.439 "flush": true, 00:14:43.439 "reset": true, 00:14:43.439 "nvme_admin": false, 00:14:43.439 "nvme_io": false, 00:14:43.439 "nvme_io_md": false, 00:14:43.439 "write_zeroes": true, 00:14:43.439 "zcopy": true, 00:14:43.439 "get_zone_info": false, 00:14:43.439 "zone_management": false, 00:14:43.439 "zone_append": false, 00:14:43.439 "compare": false, 00:14:43.439 "compare_and_write": false, 00:14:43.439 "abort": true, 00:14:43.439 "seek_hole": false, 00:14:43.439 "seek_data": false, 00:14:43.439 "copy": true, 00:14:43.439 "nvme_iov_md": false 00:14:43.439 }, 00:14:43.439 "memory_domains": [ 00:14:43.439 { 00:14:43.439 "dma_device_id": "system", 00:14:43.439 "dma_device_type": 1 00:14:43.439 }, 00:14:43.439 { 
00:14:43.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.439 "dma_device_type": 2 00:14:43.439 } 00:14:43.439 ], 00:14:43.439 "driver_specific": {} 00:14:43.439 } 00:14:43.439 ] 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:14:43.439 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:43.439 17:08:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:43.439 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:14:43.439 17:08:38 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:14:43.439 Running I/O for 5 seconds... 00:14:44.377 17:08:39 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 4103215 00:14:44.377 17:08:39 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 4103215' 00:14:44.377 Process is existed as continue on error is set. 
Pid: 4103215 00:14:44.377 17:08:39 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:14:44.377 17:08:39 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:44.377 17:08:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:44.377 17:08:39 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:44.377 17:08:39 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:14:44.377 17:08:39 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:44.377 17:08:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:44.377 17:08:39 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:44.377 17:08:39 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:14:44.637 Timeout while waiting for response: 00:14:44.637 00:14:44.637 00:14:48.833 00:14:48.833 Latency(us) 00:14:48.833 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:48.833 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:14:48.833 EE_Dev_1 : 0.89 28906.52 112.92 5.59 0.00 548.95 164.73 861.94 00:14:48.833 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:14:48.833 Dev_2 : 5.00 62216.36 243.03 0.00 0.00 252.64 86.37 31001.38 00:14:48.833 =================================================================================================================== 00:14:48.833 Total : 91122.88 355.95 5.59 0.00 275.36 86.37 31001.38 00:14:49.402 17:08:44 blockdev_general.bdev_error -- bdev/blockdev.sh@498 -- # killprocess 4103215 00:14:49.402 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 4103215 ']' 00:14:49.402 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 4103215 00:14:49.402 17:08:44 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:14:49.402 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:49.402 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4103215 00:14:49.402 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:14:49.662 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:14:49.662 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4103215' 00:14:49.662 killing process with pid 4103215 00:14:49.662 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 4103215 00:14:49.662 Received shutdown signal, test time was about 5.000000 seconds 00:14:49.662 00:14:49.662 Latency(us) 00:14:49.662 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:49.662 =================================================================================================================== 00:14:49.662 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:14:49.662 17:08:44 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 4103215 00:14:49.921 17:08:45 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=4104118 00:14:49.921 17:08:45 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 4104118' 00:14:49.921 Process error testing pid: 4104118 00:14:49.921 17:08:45 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:14:49.921 17:08:45 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 4104118 00:14:49.922 17:08:45 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 4104118 ']' 00:14:49.922 17:08:45 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:49.922 17:08:45 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:49.922 17:08:45 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:49.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:49.922 17:08:45 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:49.922 17:08:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:49.922 [2024-07-23 17:08:45.225156] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:14:49.922 [2024-07-23 17:08:45.225231] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4104118 ] 00:14:50.181 [2024-07-23 17:08:45.364412] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:50.181 [2024-07-23 17:08:45.424189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:14:50.750 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:50.750 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:14:50.750 17:08:46 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:14:50.750 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:50.750 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 Dev_1 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.010 17:08:46 
blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 [ 00:14:51.010 { 00:14:51.010 "name": "Dev_1", 00:14:51.010 "aliases": [ 00:14:51.010 "ee72906c-d6ab-4b41-b5d9-7a90a40b41e2" 00:14:51.010 ], 00:14:51.010 "product_name": "Malloc disk", 00:14:51.010 "block_size": 512, 00:14:51.010 "num_blocks": 262144, 00:14:51.010 "uuid": "ee72906c-d6ab-4b41-b5d9-7a90a40b41e2", 00:14:51.010 "assigned_rate_limits": { 00:14:51.010 "rw_ios_per_sec": 0, 00:14:51.010 "rw_mbytes_per_sec": 0, 00:14:51.010 "r_mbytes_per_sec": 0, 00:14:51.010 "w_mbytes_per_sec": 0 00:14:51.010 }, 00:14:51.010 "claimed": false, 00:14:51.010 "zoned": false, 00:14:51.010 "supported_io_types": { 00:14:51.010 "read": true, 00:14:51.010 
"write": true, 00:14:51.010 "unmap": true, 00:14:51.010 "flush": true, 00:14:51.010 "reset": true, 00:14:51.010 "nvme_admin": false, 00:14:51.010 "nvme_io": false, 00:14:51.010 "nvme_io_md": false, 00:14:51.010 "write_zeroes": true, 00:14:51.010 "zcopy": true, 00:14:51.010 "get_zone_info": false, 00:14:51.010 "zone_management": false, 00:14:51.010 "zone_append": false, 00:14:51.010 "compare": false, 00:14:51.010 "compare_and_write": false, 00:14:51.010 "abort": true, 00:14:51.010 "seek_hole": false, 00:14:51.010 "seek_data": false, 00:14:51.010 "copy": true, 00:14:51.010 "nvme_iov_md": false 00:14:51.010 }, 00:14:51.010 "memory_domains": [ 00:14:51.010 { 00:14:51.010 "dma_device_id": "system", 00:14:51.010 "dma_device_type": 1 00:14:51.010 }, 00:14:51.010 { 00:14:51.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.010 "dma_device_type": 2 00:14:51.010 } 00:14:51.010 ], 00:14:51.010 "driver_specific": {} 00:14:51.010 } 00:14:51.010 ] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:14:51.010 17:08:46 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 true 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 Dev_2 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.010 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.010 [ 00:14:51.010 { 00:14:51.010 "name": "Dev_2", 00:14:51.010 "aliases": [ 00:14:51.010 "a93f97c7-3b56-44e6-9337-808134317d59" 00:14:51.010 ], 00:14:51.010 "product_name": "Malloc disk", 00:14:51.010 "block_size": 512, 00:14:51.011 "num_blocks": 262144, 00:14:51.011 "uuid": "a93f97c7-3b56-44e6-9337-808134317d59", 00:14:51.011 "assigned_rate_limits": { 00:14:51.011 "rw_ios_per_sec": 0, 00:14:51.011 "rw_mbytes_per_sec": 0, 00:14:51.011 "r_mbytes_per_sec": 0, 00:14:51.011 "w_mbytes_per_sec": 0 00:14:51.011 }, 00:14:51.011 "claimed": false, 00:14:51.011 "zoned": false, 00:14:51.011 "supported_io_types": { 
00:14:51.011 "read": true, 00:14:51.011 "write": true, 00:14:51.011 "unmap": true, 00:14:51.011 "flush": true, 00:14:51.011 "reset": true, 00:14:51.011 "nvme_admin": false, 00:14:51.011 "nvme_io": false, 00:14:51.011 "nvme_io_md": false, 00:14:51.011 "write_zeroes": true, 00:14:51.011 "zcopy": true, 00:14:51.011 "get_zone_info": false, 00:14:51.011 "zone_management": false, 00:14:51.011 "zone_append": false, 00:14:51.011 "compare": false, 00:14:51.011 "compare_and_write": false, 00:14:51.011 "abort": true, 00:14:51.011 "seek_hole": false, 00:14:51.011 "seek_data": false, 00:14:51.011 "copy": true, 00:14:51.011 "nvme_iov_md": false 00:14:51.011 }, 00:14:51.011 "memory_domains": [ 00:14:51.011 { 00:14:51.011 "dma_device_id": "system", 00:14:51.011 "dma_device_type": 1 00:14:51.011 }, 00:14:51.011 { 00:14:51.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.011 "dma_device_type": 2 00:14:51.011 } 00:14:51.011 ], 00:14:51.011 "driver_specific": {} 00:14:51.011 } 00:14:51.011 ] 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:14:51.011 17:08:46 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:51.011 17:08:46 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 4104118 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 4104118 00:14:51.011 17:08:46 blockdev_general.bdev_error -- 
bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:51.011 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 4104118 00:14:51.270 Running I/O for 5 seconds... 00:14:51.270 task offset: 105064 on job bdev=EE_Dev_1 fails 00:14:51.270 00:14:51.270 Latency(us) 00:14:51.270 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:51.270 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:14:51.270 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:14:51.270 EE_Dev_1 : 0.00 23084.99 90.18 5246.59 0.00 472.34 170.07 837.01 00:14:51.270 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:14:51.270 Dev_2 : 0.00 14279.34 55.78 0.00 0.00 831.17 164.73 1545.79 00:14:51.270 =================================================================================================================== 00:14:51.270 Total : 37364.33 145.95 5246.59 0.00 666.96 164.73 1545.79 00:14:51.270 [2024-07-23 17:08:46.459173] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:14:51.270 request: 00:14:51.270 { 00:14:51.270 "method": "perform_tests", 00:14:51.270 "req_id": 1 00:14:51.270 } 00:14:51.270 Got JSON-RPC error response 00:14:51.270 response: 00:14:51.270 { 00:14:51.270 "code": -32603, 00:14:51.270 "message": "bdevperf failed with error Operation not permitted" 00:14:51.270 } 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:51.530 00:14:51.530 real 0m9.285s 00:14:51.530 user 0m9.634s 00:14:51.530 sys 0m0.942s 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:51.530 17:08:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:14:51.530 ************************************ 00:14:51.530 END TEST bdev_error 00:14:51.530 ************************************ 00:14:51.530 17:08:46 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:14:51.530 17:08:46 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:14:51.530 17:08:46 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:14:51.530 17:08:46 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:51.530 17:08:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:51.530 ************************************ 00:14:51.530 START TEST bdev_stat 00:14:51.530 ************************************ 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=4104474 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 4104474' 00:14:51.530 Process Bdev IO statistics 
testing pid: 4104474 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 4104474 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 4104474 ']' 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:51.530 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:51.530 17:08:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:51.789 [2024-07-23 17:08:46.978522] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:14:51.789 [2024-07-23 17:08:46.978590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4104474 ] 00:14:51.789 [2024-07-23 17:08:47.103020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:14:51.789 [2024-07-23 17:08:47.157924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:14:51.789 [2024-07-23 17:08:47.157930] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:52.049 Malloc_STAT 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:52.049 [ 00:14:52.049 { 00:14:52.049 "name": "Malloc_STAT", 00:14:52.049 "aliases": [ 00:14:52.049 "ef9c1af8-b319-4032-ad3d-4559cd53d9ae" 00:14:52.049 ], 00:14:52.049 "product_name": "Malloc disk", 00:14:52.049 "block_size": 512, 00:14:52.049 "num_blocks": 262144, 00:14:52.049 "uuid": "ef9c1af8-b319-4032-ad3d-4559cd53d9ae", 00:14:52.049 "assigned_rate_limits": { 00:14:52.049 "rw_ios_per_sec": 0, 00:14:52.049 "rw_mbytes_per_sec": 0, 00:14:52.049 "r_mbytes_per_sec": 0, 00:14:52.049 "w_mbytes_per_sec": 0 00:14:52.049 }, 00:14:52.049 "claimed": false, 00:14:52.049 "zoned": false, 00:14:52.049 "supported_io_types": { 00:14:52.049 "read": true, 00:14:52.049 "write": true, 00:14:52.049 "unmap": true, 00:14:52.049 "flush": true, 00:14:52.049 "reset": true, 00:14:52.049 "nvme_admin": false, 00:14:52.049 "nvme_io": false, 00:14:52.049 "nvme_io_md": false, 00:14:52.049 "write_zeroes": true, 00:14:52.049 "zcopy": true, 00:14:52.049 "get_zone_info": false, 00:14:52.049 "zone_management": false, 00:14:52.049 "zone_append": false, 00:14:52.049 "compare": false, 00:14:52.049 "compare_and_write": false, 00:14:52.049 "abort": true, 00:14:52.049 "seek_hole": false, 00:14:52.049 "seek_data": false, 00:14:52.049 "copy": true, 00:14:52.049 "nvme_iov_md": false 00:14:52.049 }, 00:14:52.049 "memory_domains": [ 00:14:52.049 { 00:14:52.049 "dma_device_id": "system", 
00:14:52.049 "dma_device_type": 1 00:14:52.049 }, 00:14:52.049 { 00:14:52.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.049 "dma_device_type": 2 00:14:52.049 } 00:14:52.049 ], 00:14:52.049 "driver_specific": {} 00:14:52.049 } 00:14:52.049 ] 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:14:52.049 17:08:47 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:14:52.049 Running I/O for 10 seconds... 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:53.955 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:54.214 
17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:14:54.214 "tick_rate": 2300000000, 00:14:54.214 "ticks": 6975806294167108, 00:14:54.214 "bdevs": [ 00:14:54.214 { 00:14:54.214 "name": "Malloc_STAT", 00:14:54.214 "bytes_read": 667988480, 00:14:54.214 "num_read_ops": 163076, 00:14:54.214 "bytes_written": 0, 00:14:54.214 "num_write_ops": 0, 00:14:54.214 "bytes_unmapped": 0, 00:14:54.214 "num_unmap_ops": 0, 00:14:54.214 "bytes_copied": 0, 00:14:54.214 "num_copy_ops": 0, 00:14:54.214 "read_latency_ticks": 2229549451996, 00:14:54.214 "max_read_latency_ticks": 17829592, 00:14:54.214 "min_read_latency_ticks": 244940, 00:14:54.214 "write_latency_ticks": 0, 00:14:54.214 "max_write_latency_ticks": 0, 00:14:54.214 "min_write_latency_ticks": 0, 00:14:54.214 "unmap_latency_ticks": 0, 00:14:54.214 "max_unmap_latency_ticks": 0, 00:14:54.214 "min_unmap_latency_ticks": 0, 00:14:54.214 "copy_latency_ticks": 0, 00:14:54.214 "max_copy_latency_ticks": 0, 00:14:54.214 "min_copy_latency_ticks": 0, 00:14:54.214 "io_error": {} 00:14:54.214 } 00:14:54.214 ] 00:14:54.214 }' 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=163076 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:54.214 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:14:54.214 "tick_rate": 2300000000, 00:14:54.214 "ticks": 6975806470831994, 
00:14:54.214 "name": "Malloc_STAT", 00:14:54.214 "channels": [ 00:14:54.214 { 00:14:54.214 "thread_id": 2, 00:14:54.214 "bytes_read": 378535936, 00:14:54.214 "num_read_ops": 92416, 00:14:54.214 "bytes_written": 0, 00:14:54.214 "num_write_ops": 0, 00:14:54.215 "bytes_unmapped": 0, 00:14:54.215 "num_unmap_ops": 0, 00:14:54.215 "bytes_copied": 0, 00:14:54.215 "num_copy_ops": 0, 00:14:54.215 "read_latency_ticks": 1158391740894, 00:14:54.215 "max_read_latency_ticks": 13246758, 00:14:54.215 "min_read_latency_ticks": 8444360, 00:14:54.215 "write_latency_ticks": 0, 00:14:54.215 "max_write_latency_ticks": 0, 00:14:54.215 "min_write_latency_ticks": 0, 00:14:54.215 "unmap_latency_ticks": 0, 00:14:54.215 "max_unmap_latency_ticks": 0, 00:14:54.215 "min_unmap_latency_ticks": 0, 00:14:54.215 "copy_latency_ticks": 0, 00:14:54.215 "max_copy_latency_ticks": 0, 00:14:54.215 "min_copy_latency_ticks": 0 00:14:54.215 }, 00:14:54.215 { 00:14:54.215 "thread_id": 3, 00:14:54.215 "bytes_read": 315621376, 00:14:54.215 "num_read_ops": 77056, 00:14:54.215 "bytes_written": 0, 00:14:54.215 "num_write_ops": 0, 00:14:54.215 "bytes_unmapped": 0, 00:14:54.215 "num_unmap_ops": 0, 00:14:54.215 "bytes_copied": 0, 00:14:54.215 "num_copy_ops": 0, 00:14:54.215 "read_latency_ticks": 1158649011366, 00:14:54.215 "max_read_latency_ticks": 17829592, 00:14:54.215 "min_read_latency_ticks": 9711590, 00:14:54.215 "write_latency_ticks": 0, 00:14:54.215 "max_write_latency_ticks": 0, 00:14:54.215 "min_write_latency_ticks": 0, 00:14:54.215 "unmap_latency_ticks": 0, 00:14:54.215 "max_unmap_latency_ticks": 0, 00:14:54.215 "min_unmap_latency_ticks": 0, 00:14:54.215 "copy_latency_ticks": 0, 00:14:54.215 "max_copy_latency_ticks": 0, 00:14:54.215 "min_copy_latency_ticks": 0 00:14:54.215 } 00:14:54.215 ] 00:14:54.215 }' 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # 
io_count_per_channel1=92416 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=92416 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=77056 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=169472 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:14:54.215 "tick_rate": 2300000000, 00:14:54.215 "ticks": 6975806759813068, 00:14:54.215 "bdevs": [ 00:14:54.215 { 00:14:54.215 "name": "Malloc_STAT", 00:14:54.215 "bytes_read": 738243072, 00:14:54.215 "num_read_ops": 180228, 00:14:54.215 "bytes_written": 0, 00:14:54.215 "num_write_ops": 0, 00:14:54.215 "bytes_unmapped": 0, 00:14:54.215 "num_unmap_ops": 0, 00:14:54.215 "bytes_copied": 0, 00:14:54.215 "num_copy_ops": 0, 00:14:54.215 "read_latency_ticks": 2464299027566, 00:14:54.215 "max_read_latency_ticks": 17829592, 00:14:54.215 "min_read_latency_ticks": 244940, 00:14:54.215 "write_latency_ticks": 0, 00:14:54.215 "max_write_latency_ticks": 0, 00:14:54.215 "min_write_latency_ticks": 0, 00:14:54.215 "unmap_latency_ticks": 0, 00:14:54.215 "max_unmap_latency_ticks": 0, 00:14:54.215 "min_unmap_latency_ticks": 0, 00:14:54.215 "copy_latency_ticks": 0, 00:14:54.215 "max_copy_latency_ticks": 0, 00:14:54.215 "min_copy_latency_ticks": 0, 00:14:54.215 "io_error": {} 00:14:54.215 } 00:14:54.215 ] 00:14:54.215 }' 00:14:54.215 
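The trace above shows `stat_function_test` taking a whole-bdev iostat snapshot, summing per-channel read counts, then taking a second snapshot. A minimal Python sketch of the consistency check the script performs (values copied from the JSON dumps above; this is an illustration, not SPDK code):

```python
# Values transcribed from the bdev_get_iostat dumps in the trace above
whole_bdev_before = {"bdevs": [{"name": "Malloc_STAT", "num_read_ops": 163076}]}
per_channel = {"channels": [{"thread_id": 2, "num_read_ops": 92416},
                            {"thread_id": 3, "num_read_ops": 77056}]}
whole_bdev_after = {"bdevs": [{"name": "Malloc_STAT", "num_read_ops": 180228}]}

io_count1 = whole_bdev_before["bdevs"][0]["num_read_ops"]
io_count_all = sum(ch["num_read_ops"] for ch in per_channel["channels"])
io_count2 = whole_bdev_after["bdevs"][0]["num_read_ops"]

# The shell test fails if the channel sum falls outside the two snapshots,
# mirroring: '[' 169472 -lt 163076 ']' and '[' 169472 -gt 180228 ']'
assert io_count1 <= io_count_all <= io_count2
```

Since I/O keeps running between the snapshots, the per-channel sum taken in between must land between the two whole-bdev counts.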
17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=180228 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 169472 -lt 163076 ']' 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 169472 -gt 180228 ']' 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:14:54.215 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:54.474 00:14:54.474 Latency(us) 00:14:54.474 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:54.474 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:14:54.474 Malloc_STAT : 2.18 46897.90 183.19 0.00 0.00 5445.81 1381.95 5784.26 00:14:54.474 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:14:54.474 Malloc_STAT : 2.18 39116.21 152.80 0.00 0.00 6528.18 1196.74 7807.33 00:14:54.474 =================================================================================================================== 00:14:54.474 Total : 86014.11 335.99 0.00 0.00 5938.20 1196.74 7807.33 00:14:54.474 0 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 4104474 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 4104474 ']' 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 4104474 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4104474 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4104474' 00:14:54.474 killing process with pid 4104474 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 4104474 00:14:54.474 Received shutdown signal, test time was about 2.256889 seconds 00:14:54.474 00:14:54.474 Latency(us) 00:14:54.474 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:54.474 =================================================================================================================== 00:14:54.474 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:14:54.474 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 4104474 00:14:54.734 17:08:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:14:54.734 00:14:54.734 real 0m2.999s 00:14:54.734 user 0m5.913s 00:14:54.734 sys 0m0.436s 00:14:54.734 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:54.734 17:08:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:14:54.734 ************************************ 00:14:54.734 END TEST bdev_stat 00:14:54.734 ************************************ 00:14:54.734 17:08:49 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 
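The `Average` column of the `Latency(us)` table above can be derived from the per-channel iostat dumps: divide `read_latency_ticks` by `num_read_ops` and by the reported `tick_rate`. A hedged sketch using the numbers from the trace (the results land close to, but not exactly on, the table values, since bdevperf samples a slightly different window):

```python
TICK_RATE = 2_300_000_000  # "tick_rate" from the iostat dumps above

def avg_read_latency_us(read_latency_ticks, num_read_ops, tick_rate=TICK_RATE):
    """Mean per-I/O read latency in microseconds."""
    return read_latency_ticks / num_read_ops / tick_rate * 1e6

# Per-channel figures from the bdev_get_iostat -c dump
ch2 = avg_read_latency_us(1158391740894, 92416)  # thread_id 2, ~5450 us
ch3 = avg_read_latency_us(1158649011366, 77056)  # thread_id 3, ~6538 us
```

Compare ~5450 us and ~6538 us against the table's 5445.81 and 6528.18 averages for the two cores.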
00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:14:54.734 17:08:49 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:14:54.734 00:14:54.734 real 1m56.762s 00:14:54.734 user 7m10.551s 00:14:54.734 sys 0m23.914s 00:14:54.734 17:08:49 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:54.734 17:08:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:14:54.734 ************************************ 00:14:54.734 END TEST blockdev_general 00:14:54.734 ************************************ 00:14:54.734 17:08:50 -- common/autotest_common.sh@1142 -- # return 0 00:14:54.734 17:08:50 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:14:54.734 17:08:50 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:14:54.734 17:08:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:54.734 17:08:50 -- common/autotest_common.sh@10 -- # set +x 00:14:54.734 ************************************ 00:14:54.734 START TEST bdev_raid 00:14:54.734 ************************************ 00:14:54.734 17:08:50 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:14:54.734 * Looking for test storage... 
00:14:54.993 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:14:54.993 17:08:50 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:14:54.993 17:08:50 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:14:54.993 17:08:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:14:54.993 17:08:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:54.993 17:08:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:54.993 ************************************ 00:14:54.993 START TEST raid_function_test_raid0 00:14:54.993 ************************************ 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:14:54.993 17:08:50 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=4104918 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 4104918' 00:14:54.993 Process raid pid: 4104918 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 4104918 /var/tmp/spdk-raid.sock 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 4104918 ']' 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:54.993 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:54.993 17:08:50 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:14:54.993 [2024-07-23 17:08:50.302201] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:14:54.993 [2024-07-23 17:08:50.302266] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:55.252 [2024-07-23 17:08:50.422612] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:55.252 [2024-07-23 17:08:50.472196] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:55.252 [2024-07-23 17:08:50.532442] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:55.252 [2024-07-23 17:08:50.532477] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:55.820 17:08:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:55.820 17:08:51 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:14:55.820 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:14:55.820 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:14:55.820 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:14:55.820 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:14:56.079 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:14:56.338 [2024-07-23 17:08:51.752761] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:14:56.338 [2024-07-23 17:08:51.754216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:14:56.338 [2024-07-23 17:08:51.754279] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2907210 00:14:56.338 [2024-07-23 17:08:51.754291] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:14:56.338 [2024-07-23 17:08:51.754466] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28fee40 00:14:56.338 [2024-07-23 17:08:51.754586] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2907210 00:14:56.338 [2024-07-23 17:08:51.754596] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x2907210 00:14:56.338 [2024-07-23 17:08:51.754695] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:56.338 Base_1 00:14:56.338 Base_2 00:14:56.597 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:14:56.597 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:14:56.597 17:08:51 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:14:56.856 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:14:57.115 [2024-07-23 17:08:52.530853] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28fee40 00:14:57.375 /dev/nbd0 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:57.375 1+0 records in 00:14:57.375 1+0 
records out 00:14:57.375 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275432 s, 14.9 MB/s 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:14:57.375 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:14:57.634 { 00:14:57.634 "nbd_device": "/dev/nbd0", 00:14:57.634 "bdev_name": "raid" 00:14:57.634 } 00:14:57.634 ]' 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:14:57.634 { 00:14:57.634 "nbd_device": "/dev/nbd0", 00:14:57.634 "bdev_name": "raid" 00:14:57.634 } 00:14:57.634 ]' 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:14:57.634 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:14:57.635 17:08:52 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:14:57.635 4096+0 records in 00:14:57.635 4096+0 records out 00:14:57.635 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0312475 s, 67.1 MB/s 00:14:57.635 17:08:52 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:14:57.894 4096+0 records in 00:14:57.894 4096+0 records out 00:14:57.894 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.311241 s, 6.7 MB/s 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:14:57.894 128+0 records in 00:14:57.894 128+0 records out 00:14:57.894 65536 
bytes (66 kB, 64 KiB) copied, 0.000511499 s, 128 MB/s 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:14:57.894 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:14:58.154 2035+0 records in 00:14:58.154 2035+0 records out 00:14:58.154 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0118117 s, 88.2 MB/s 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:14:58.154 456+0 records in 00:14:58.154 456+0 records out 00:14:58.154 233472 bytes (233 kB, 228 KiB) copied, 0.00275644 s, 84.7 MB/s 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:14:58.154 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:14:58.444 [2024-07-23 17:08:53.642540] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:14:58.444 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:14:58.703 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:14:58.703 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:14:58.703 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
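The `raid_unmap_data_verify` steps traced above (dd random data onto the nbd device, `blkdiscard` three block ranges, zero the same ranges in the reference file, `cmp` the two) can be sketched with plain bytearrays. This is a simplified model, not SPDK code; it assumes, as the test does, that discarded blocks on the malloc-backed raid read back as zeroes:

```python
import os

BLK = 512            # logical block size, per the lsblk LOG-SEC output above
TOTAL_BLOCKS = 4096  # rw_blk_num in the trace

# Stand-ins for /raidtest/raidrandtest and /dev/nbd0 (hypothetical names)
reference = bytearray(os.urandom(TOTAL_BLOCKS * BLK))
device = bytearray(reference)  # dd if=reference of=/dev/nbd0

# (offset_blocks, num_blocks) pairs from unmap_blk_offs / unmap_blk_nums
unmap_ranges = [(0, 128), (1028, 2035), (321, 456)]

for off, num in unmap_ranges:
    start, length = off * BLK, num * BLK
    device[start:start + length] = bytes(length)     # blkdiscard -o -l
    reference[start:start + length] = bytes(length)  # dd if=/dev/zero conv=notrunc
    assert reference == device                       # cmp -b -n 2097152
```

Each iteration discards one range on the device, zeroes the matching range in the reference image, and verifies the full 2 MiB contents still match.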
00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 4104918 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 4104918 ']' 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 4104918 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:58.704 17:08:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4104918 00:14:58.704 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:58.704 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:58.704 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4104918' 00:14:58.704 killing process with pid 4104918 00:14:58.704 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 4104918 00:14:58.704 [2024-07-23 17:08:54.019848] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:58.704 [2024-07-23 17:08:54.019921] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:58.704 [2024-07-23 17:08:54.019960] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:58.704 [2024-07-23 17:08:54.019972] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2907210 name 
raid, state offline 00:14:58.704 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 4104918 00:14:58.704 [2024-07-23 17:08:54.037457] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:58.963 17:08:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:14:58.963 00:14:58.963 real 0m3.998s 00:14:58.963 user 0m5.534s 00:14:58.963 sys 0m1.365s 00:14:58.963 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:58.963 17:08:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:14:58.963 ************************************ 00:14:58.963 END TEST raid_function_test_raid0 00:14:58.963 ************************************ 00:14:58.963 17:08:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:58.963 17:08:54 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:14:58.963 17:08:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:14:58.963 17:08:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:58.963 17:08:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:58.963 ************************************ 00:14:58.963 START TEST raid_function_test_concat 00:14:58.963 ************************************ 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=4105533 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 4105533' 00:14:58.963 Process raid pid: 4105533 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 4105533 /var/tmp/spdk-raid.sock 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 4105533 ']' 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:58.963 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:58.963 17:08:54 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:14:59.222 [2024-07-23 17:08:54.431218] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:14:59.222 [2024-07-23 17:08:54.431352] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:59.222 [2024-07-23 17:08:54.634521] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:59.481 [2024-07-23 17:08:54.688352] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:59.481 [2024-07-23 17:08:54.749322] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:59.481 [2024-07-23 17:08:54.749349] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:15:00.418 17:08:55 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:15:00.987 [2024-07-23 17:08:56.111575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:15:00.987 [2024-07-23 17:08:56.113032] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:15:00.987 [2024-07-23 17:08:56.113087] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2826210 00:15:00.987 [2024-07-23 17:08:56.113098] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:15:00.987 [2024-07-23 17:08:56.113273] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x281de40 00:15:00.987 [2024-07-23 17:08:56.113391] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2826210 00:15:00.987 [2024-07-23 17:08:56.113401] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x2826210 00:15:00.987 [2024-07-23 17:08:56.113496] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:00.987 Base_1 00:15:00.987 Base_2 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:15:00.987 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:15:01.556 [2024-07-23 17:08:56.881641] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x281de40 00:15:01.556 /dev/nbd0 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:01.556 1+0 records in 
00:15:01.556 1+0 records out 00:15:01.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259721 s, 15.8 MB/s 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:15:01.556 17:08:56 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:15:01.815 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:15:01.815 { 00:15:01.815 "nbd_device": "/dev/nbd0", 00:15:01.815 "bdev_name": "raid" 00:15:01.815 } 00:15:01.815 ]' 00:15:01.815 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:15:01.815 { 00:15:01.815 "nbd_device": "/dev/nbd0", 00:15:01.815 "bdev_name": "raid" 00:15:01.815 } 00:15:01.815 ]' 00:15:01.815 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:15:02.074 17:08:57 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:15:02.074 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:15:02.074 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:15:02.074 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:15:02.074 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:15:02.075 4096+0 records in 00:15:02.075 4096+0 records out 00:15:02.075 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0301406 s, 69.6 MB/s 00:15:02.075 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:15:02.334 4096+0 records in 00:15:02.334 4096+0 records out 00:15:02.334 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.310778 s, 6.7 MB/s 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:15:02.334 
128+0 records in 00:15:02.334 128+0 records out 00:15:02.334 65536 bytes (66 kB, 64 KiB) copied, 0.000834557 s, 78.5 MB/s 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:15:02.334 2035+0 records in 00:15:02.334 2035+0 records out 00:15:02.334 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0105528 s, 98.7 MB/s 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:15:02.334 456+0 records in 00:15:02.334 456+0 records out 00:15:02.334 233472 bytes (233 kB, 228 KiB) copied, 0.00277281 s, 84.2 MB/s 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:02.334 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:15:02.594 [2024-07-23 17:08:57.979347] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:15:02.594 17:08:57 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # true 00:15:02.853 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 4105533 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 4105533 ']' 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 4105533 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:02.854 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4105533 00:15:03.113 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:03.113 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:03.114 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4105533' 00:15:03.114 killing process with pid 4105533 00:15:03.114 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 4105533 00:15:03.114 [2024-07-23 17:08:58.278175] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:03.114 [2024-07-23 17:08:58.278236] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:03.114 [2024-07-23 17:08:58.278273] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to 
free all in destruct 00:15:03.114 [2024-07-23 17:08:58.278285] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2826210 name raid, state offline 00:15:03.114 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 4105533 00:15:03.114 [2024-07-23 17:08:58.295312] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:03.114 17:08:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:15:03.114 00:15:03.114 real 0m4.170s 00:15:03.114 user 0m5.782s 00:15:03.114 sys 0m1.480s 00:15:03.114 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:03.114 17:08:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:15:03.114 ************************************ 00:15:03.114 END TEST raid_function_test_concat 00:15:03.114 ************************************ 00:15:03.373 17:08:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:03.373 17:08:58 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:15:03.373 17:08:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:15:03.373 17:08:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:03.373 17:08:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:03.373 ************************************ 00:15:03.373 START TEST raid0_resize_test 00:15:03.373 ************************************ 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local 
blkcnt 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=4106153 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 4106153' 00:15:03.373 Process raid pid: 4106153 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 4106153 /var/tmp/spdk-raid.sock 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 4106153 ']' 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:03.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:03.373 17:08:58 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.373 [2024-07-23 17:08:58.647922] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:15:03.373 [2024-07-23 17:08:58.647995] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:03.373 [2024-07-23 17:08:58.785095] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:03.632 [2024-07-23 17:08:58.840561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:03.632 [2024-07-23 17:08:58.905634] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:03.632 [2024-07-23 17:08:58.905672] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:04.198 17:08:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:04.198 17:08:59 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:15:04.198 17:08:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:15:04.456 Base_1 00:15:04.456 17:08:59 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:15:04.714 Base_2 00:15:04.714 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:15:04.973 [2024-07-23 17:09:00.234762] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:15:04.973 [2024-07-23 17:09:00.236337] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:15:04.973 [2024-07-23 17:09:00.236386] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xea92c0 00:15:04.973 [2024-07-23 17:09:00.236396] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:15:04.973 [2024-07-23 17:09:00.236612] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd01750 00:15:04.973 [2024-07-23 17:09:00.236715] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xea92c0 00:15:04.973 [2024-07-23 17:09:00.236724] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xea92c0 00:15:04.973 [2024-07-23 17:09:00.236832] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.973 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:15:05.231 [2024-07-23 17:09:00.487410] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:15:05.231 [2024-07-23 17:09:00.487431] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:15:05.231 true 00:15:05.231 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:15:05.231 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:15:05.489 [2024-07-23 17:09:00.668056] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:05.489 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:15:05.489 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:15:05.489 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:15:05.489 17:09:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:15:06.056 
[2024-07-23 17:09:01.177205] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:15:06.056 [2024-07-23 17:09:01.177230] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:15:06.056 [2024-07-23 17:09:01.177256] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:15:06.056 true 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:15:06.056 [2024-07-23 17:09:01.434050] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 4106153 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 4106153 ']' 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 4106153 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:06.056 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4106153 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4106153' 00:15:06.315 killing process with pid 4106153 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 4106153 00:15:06.315 [2024-07-23 17:09:01.505563] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:06.315 [2024-07-23 17:09:01.505619] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.315 [2024-07-23 17:09:01.505663] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.315 [2024-07-23 17:09:01.505675] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xea92c0 name Raid, state offline 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 4106153 00:15:06.315 [2024-07-23 17:09:01.507050] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:15:06.315 00:15:06.315 real 0m3.114s 00:15:06.315 user 0m4.823s 00:15:06.315 sys 0m0.688s 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:06.315 17:09:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.315 ************************************ 00:15:06.315 END TEST raid0_resize_test 00:15:06.315 ************************************ 00:15:06.575 17:09:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:06.575 17:09:01 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:15:06.575 17:09:01 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:06.575 17:09:01 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:15:06.575 17:09:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:15:06.575 17:09:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:06.575 17:09:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:06.575 ************************************ 00:15:06.575 START TEST raid_state_function_test 00:15:06.575 ************************************ 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:06.575 17:09:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4106790 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4106790' 00:15:06.575 Process raid pid: 4106790 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4106790 /var/tmp/spdk-raid.sock 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4106790 ']' 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:06.575 17:09:01 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:06.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:06.575 17:09:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.575 [2024-07-23 17:09:01.852103] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:15:06.575 [2024-07-23 17:09:01.852175] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:06.575 [2024-07-23 17:09:01.987703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:06.834 [2024-07-23 17:09:02.043632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:06.834 [2024-07-23 17:09:02.099683] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:06.834 [2024-07-23 17:09:02.099718] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.401 17:09:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:07.401 17:09:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:07.401 17:09:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:07.660 [2024-07-23 17:09:03.025384] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:07.660 [2024-07-23 17:09:03.025426] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:07.660 [2024-07-23 17:09:03.025437] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.660 [2024-07-23 17:09:03.025449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.660 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.228 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.228 "name": "Existed_Raid", 00:15:08.228 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:08.228 "strip_size_kb": 64, 00:15:08.228 "state": "configuring", 00:15:08.228 "raid_level": "raid0", 00:15:08.228 "superblock": false, 00:15:08.228 "num_base_bdevs": 2, 00:15:08.228 "num_base_bdevs_discovered": 0, 00:15:08.228 "num_base_bdevs_operational": 2, 00:15:08.228 "base_bdevs_list": [ 00:15:08.228 { 00:15:08.228 "name": "BaseBdev1", 00:15:08.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.228 "is_configured": false, 00:15:08.228 "data_offset": 0, 00:15:08.228 "data_size": 0 00:15:08.228 }, 00:15:08.228 { 00:15:08.228 "name": "BaseBdev2", 00:15:08.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.228 "is_configured": false, 00:15:08.228 "data_offset": 0, 00:15:08.228 "data_size": 0 00:15:08.228 } 00:15:08.228 ] 00:15:08.228 }' 00:15:08.228 17:09:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.228 17:09:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.796 17:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:09.055 [2024-07-23 17:09:04.388830] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:09.055 [2024-07-23 17:09:04.388865] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26cd410 name Existed_Raid, state configuring 00:15:09.055 17:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:09.315 [2024-07-23 17:09:04.637494] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:09.315 [2024-07-23 17:09:04.637524] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:15:09.315 [2024-07-23 17:09:04.637534] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:09.315 [2024-07-23 17:09:04.637545] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:09.315 17:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:09.574 [2024-07-23 17:09:04.900008] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:09.574 BaseBdev1 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:09.574 17:09:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:09.833 17:09:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:10.093 [ 00:15:10.093 { 00:15:10.093 "name": "BaseBdev1", 00:15:10.093 "aliases": [ 00:15:10.093 "3c5007a8-bec6-4ff2-888f-e0893fa80fac" 00:15:10.093 ], 00:15:10.093 "product_name": "Malloc disk", 00:15:10.093 "block_size": 512, 00:15:10.093 "num_blocks": 65536, 
00:15:10.093 "uuid": "3c5007a8-bec6-4ff2-888f-e0893fa80fac", 00:15:10.093 "assigned_rate_limits": { 00:15:10.093 "rw_ios_per_sec": 0, 00:15:10.093 "rw_mbytes_per_sec": 0, 00:15:10.093 "r_mbytes_per_sec": 0, 00:15:10.093 "w_mbytes_per_sec": 0 00:15:10.093 }, 00:15:10.093 "claimed": true, 00:15:10.093 "claim_type": "exclusive_write", 00:15:10.093 "zoned": false, 00:15:10.093 "supported_io_types": { 00:15:10.093 "read": true, 00:15:10.093 "write": true, 00:15:10.093 "unmap": true, 00:15:10.093 "flush": true, 00:15:10.093 "reset": true, 00:15:10.093 "nvme_admin": false, 00:15:10.093 "nvme_io": false, 00:15:10.093 "nvme_io_md": false, 00:15:10.093 "write_zeroes": true, 00:15:10.093 "zcopy": true, 00:15:10.093 "get_zone_info": false, 00:15:10.093 "zone_management": false, 00:15:10.093 "zone_append": false, 00:15:10.093 "compare": false, 00:15:10.093 "compare_and_write": false, 00:15:10.093 "abort": true, 00:15:10.093 "seek_hole": false, 00:15:10.093 "seek_data": false, 00:15:10.093 "copy": true, 00:15:10.093 "nvme_iov_md": false 00:15:10.093 }, 00:15:10.093 "memory_domains": [ 00:15:10.093 { 00:15:10.093 "dma_device_id": "system", 00:15:10.093 "dma_device_type": 1 00:15:10.093 }, 00:15:10.093 { 00:15:10.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.093 "dma_device_type": 2 00:15:10.093 } 00:15:10.093 ], 00:15:10.093 "driver_specific": {} 00:15:10.093 } 00:15:10.093 ] 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.093 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.352 17:09:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.352 "name": "Existed_Raid", 00:15:10.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.352 "strip_size_kb": 64, 00:15:10.352 "state": "configuring", 00:15:10.352 "raid_level": "raid0", 00:15:10.352 "superblock": false, 00:15:10.352 "num_base_bdevs": 2, 00:15:10.352 "num_base_bdevs_discovered": 1, 00:15:10.352 "num_base_bdevs_operational": 2, 00:15:10.352 "base_bdevs_list": [ 00:15:10.352 { 00:15:10.353 "name": "BaseBdev1", 00:15:10.353 "uuid": "3c5007a8-bec6-4ff2-888f-e0893fa80fac", 00:15:10.353 "is_configured": true, 00:15:10.353 "data_offset": 0, 00:15:10.353 "data_size": 65536 00:15:10.353 }, 00:15:10.353 { 00:15:10.353 "name": "BaseBdev2", 00:15:10.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.353 "is_configured": false, 00:15:10.353 "data_offset": 0, 00:15:10.353 "data_size": 0 00:15:10.353 } 00:15:10.353 ] 00:15:10.353 }' 00:15:10.353 17:09:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.353 17:09:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.921 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:11.181 [2024-07-23 17:09:06.456126] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:11.181 [2024-07-23 17:09:06.456171] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26ccd40 name Existed_Raid, state configuring 00:15:11.181 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:11.440 [2024-07-23 17:09:06.704808] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:11.440 [2024-07-23 17:09:06.706272] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:11.440 [2024-07-23 17:09:06.706306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.440 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.698 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.698 "name": "Existed_Raid", 00:15:11.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.698 "strip_size_kb": 64, 00:15:11.698 "state": "configuring", 00:15:11.698 "raid_level": "raid0", 00:15:11.698 "superblock": false, 00:15:11.698 "num_base_bdevs": 2, 00:15:11.698 "num_base_bdevs_discovered": 1, 00:15:11.698 "num_base_bdevs_operational": 2, 00:15:11.698 "base_bdevs_list": [ 00:15:11.698 { 00:15:11.698 "name": "BaseBdev1", 00:15:11.698 "uuid": "3c5007a8-bec6-4ff2-888f-e0893fa80fac", 00:15:11.698 "is_configured": true, 00:15:11.698 "data_offset": 0, 00:15:11.698 "data_size": 65536 00:15:11.698 }, 00:15:11.698 { 00:15:11.698 "name": "BaseBdev2", 00:15:11.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.698 "is_configured": false, 00:15:11.698 "data_offset": 0, 00:15:11.698 "data_size": 0 00:15:11.698 } 00:15:11.698 ] 00:15:11.698 }' 
00:15:11.698 17:09:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.698 17:09:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.266 17:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:12.525 [2024-07-23 17:09:07.811165] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:12.525 [2024-07-23 17:09:07.811202] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26cc990 00:15:12.525 [2024-07-23 17:09:07.811211] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:15:12.525 [2024-07-23 17:09:07.811460] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26cfed0 00:15:12.525 [2024-07-23 17:09:07.811573] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26cc990 00:15:12.525 [2024-07-23 17:09:07.811583] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26cc990 00:15:12.525 [2024-07-23 17:09:07.811746] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.525 BaseBdev2 00:15:12.525 17:09:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:12.525 17:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:12.525 17:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:12.525 17:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:12.526 17:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:12.526 17:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:15:12.526 17:09:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.827 17:09:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:13.087 [ 00:15:13.087 { 00:15:13.087 "name": "BaseBdev2", 00:15:13.087 "aliases": [ 00:15:13.087 "529e6b7f-944c-4726-a9db-10bde7ad3d70" 00:15:13.087 ], 00:15:13.087 "product_name": "Malloc disk", 00:15:13.087 "block_size": 512, 00:15:13.087 "num_blocks": 65536, 00:15:13.087 "uuid": "529e6b7f-944c-4726-a9db-10bde7ad3d70", 00:15:13.087 "assigned_rate_limits": { 00:15:13.087 "rw_ios_per_sec": 0, 00:15:13.087 "rw_mbytes_per_sec": 0, 00:15:13.087 "r_mbytes_per_sec": 0, 00:15:13.087 "w_mbytes_per_sec": 0 00:15:13.087 }, 00:15:13.087 "claimed": true, 00:15:13.087 "claim_type": "exclusive_write", 00:15:13.087 "zoned": false, 00:15:13.087 "supported_io_types": { 00:15:13.087 "read": true, 00:15:13.087 "write": true, 00:15:13.087 "unmap": true, 00:15:13.087 "flush": true, 00:15:13.087 "reset": true, 00:15:13.087 "nvme_admin": false, 00:15:13.087 "nvme_io": false, 00:15:13.087 "nvme_io_md": false, 00:15:13.087 "write_zeroes": true, 00:15:13.087 "zcopy": true, 00:15:13.087 "get_zone_info": false, 00:15:13.087 "zone_management": false, 00:15:13.087 "zone_append": false, 00:15:13.087 "compare": false, 00:15:13.087 "compare_and_write": false, 00:15:13.087 "abort": true, 00:15:13.087 "seek_hole": false, 00:15:13.087 "seek_data": false, 00:15:13.087 "copy": true, 00:15:13.087 "nvme_iov_md": false 00:15:13.087 }, 00:15:13.087 "memory_domains": [ 00:15:13.087 { 00:15:13.087 "dma_device_id": "system", 00:15:13.087 "dma_device_type": 1 00:15:13.087 }, 00:15:13.087 { 00:15:13.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.087 "dma_device_type": 2 
00:15:13.087 } 00:15:13.087 ], 00:15:13.087 "driver_specific": {} 00:15:13.087 } 00:15:13.087 ] 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:13.087 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.088 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.088 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.088 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.088 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.088 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.347 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:15:13.347 "name": "Existed_Raid", 00:15:13.347 "uuid": "c4d55b64-947b-4c15-a978-7761b1020da9", 00:15:13.347 "strip_size_kb": 64, 00:15:13.347 "state": "online", 00:15:13.347 "raid_level": "raid0", 00:15:13.347 "superblock": false, 00:15:13.347 "num_base_bdevs": 2, 00:15:13.347 "num_base_bdevs_discovered": 2, 00:15:13.347 "num_base_bdevs_operational": 2, 00:15:13.347 "base_bdevs_list": [ 00:15:13.347 { 00:15:13.347 "name": "BaseBdev1", 00:15:13.347 "uuid": "3c5007a8-bec6-4ff2-888f-e0893fa80fac", 00:15:13.347 "is_configured": true, 00:15:13.347 "data_offset": 0, 00:15:13.347 "data_size": 65536 00:15:13.347 }, 00:15:13.347 { 00:15:13.347 "name": "BaseBdev2", 00:15:13.347 "uuid": "529e6b7f-944c-4726-a9db-10bde7ad3d70", 00:15:13.347 "is_configured": true, 00:15:13.347 "data_offset": 0, 00:15:13.347 "data_size": 65536 00:15:13.347 } 00:15:13.347 ] 00:15:13.347 }' 00:15:13.347 17:09:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.347 17:09:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:13.915 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:13.915 17:09:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:14.175 [2024-07-23 17:09:09.411689] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:14.175 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:14.175 "name": "Existed_Raid", 00:15:14.175 "aliases": [ 00:15:14.175 "c4d55b64-947b-4c15-a978-7761b1020da9" 00:15:14.175 ], 00:15:14.175 "product_name": "Raid Volume", 00:15:14.175 "block_size": 512, 00:15:14.175 "num_blocks": 131072, 00:15:14.175 "uuid": "c4d55b64-947b-4c15-a978-7761b1020da9", 00:15:14.175 "assigned_rate_limits": { 00:15:14.175 "rw_ios_per_sec": 0, 00:15:14.175 "rw_mbytes_per_sec": 0, 00:15:14.175 "r_mbytes_per_sec": 0, 00:15:14.175 "w_mbytes_per_sec": 0 00:15:14.175 }, 00:15:14.175 "claimed": false, 00:15:14.175 "zoned": false, 00:15:14.175 "supported_io_types": { 00:15:14.175 "read": true, 00:15:14.175 "write": true, 00:15:14.175 "unmap": true, 00:15:14.175 "flush": true, 00:15:14.175 "reset": true, 00:15:14.175 "nvme_admin": false, 00:15:14.175 "nvme_io": false, 00:15:14.175 "nvme_io_md": false, 00:15:14.175 "write_zeroes": true, 00:15:14.175 "zcopy": false, 00:15:14.175 "get_zone_info": false, 00:15:14.175 "zone_management": false, 00:15:14.175 "zone_append": false, 00:15:14.175 "compare": false, 00:15:14.175 "compare_and_write": false, 00:15:14.175 "abort": false, 00:15:14.175 "seek_hole": false, 00:15:14.175 "seek_data": false, 00:15:14.175 "copy": false, 00:15:14.175 "nvme_iov_md": false 00:15:14.175 }, 00:15:14.175 "memory_domains": [ 00:15:14.175 { 00:15:14.175 "dma_device_id": "system", 00:15:14.175 "dma_device_type": 1 00:15:14.175 }, 00:15:14.175 { 00:15:14.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.175 "dma_device_type": 2 00:15:14.175 }, 00:15:14.175 { 00:15:14.175 "dma_device_id": "system", 00:15:14.175 "dma_device_type": 1 00:15:14.175 }, 00:15:14.175 { 00:15:14.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:15:14.175 "dma_device_type": 2 00:15:14.175 } 00:15:14.175 ], 00:15:14.175 "driver_specific": { 00:15:14.175 "raid": { 00:15:14.175 "uuid": "c4d55b64-947b-4c15-a978-7761b1020da9", 00:15:14.175 "strip_size_kb": 64, 00:15:14.175 "state": "online", 00:15:14.175 "raid_level": "raid0", 00:15:14.175 "superblock": false, 00:15:14.175 "num_base_bdevs": 2, 00:15:14.175 "num_base_bdevs_discovered": 2, 00:15:14.175 "num_base_bdevs_operational": 2, 00:15:14.175 "base_bdevs_list": [ 00:15:14.175 { 00:15:14.175 "name": "BaseBdev1", 00:15:14.175 "uuid": "3c5007a8-bec6-4ff2-888f-e0893fa80fac", 00:15:14.175 "is_configured": true, 00:15:14.175 "data_offset": 0, 00:15:14.175 "data_size": 65536 00:15:14.175 }, 00:15:14.175 { 00:15:14.175 "name": "BaseBdev2", 00:15:14.175 "uuid": "529e6b7f-944c-4726-a9db-10bde7ad3d70", 00:15:14.175 "is_configured": true, 00:15:14.175 "data_offset": 0, 00:15:14.175 "data_size": 65536 00:15:14.175 } 00:15:14.175 ] 00:15:14.175 } 00:15:14.175 } 00:15:14.175 }' 00:15:14.175 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:14.175 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:14.175 BaseBdev2' 00:15:14.175 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.175 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:14.175 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.434 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.434 "name": "BaseBdev1", 00:15:14.434 "aliases": [ 00:15:14.434 "3c5007a8-bec6-4ff2-888f-e0893fa80fac" 00:15:14.434 ], 00:15:14.434 "product_name": "Malloc disk", 
00:15:14.434 "block_size": 512, 00:15:14.434 "num_blocks": 65536, 00:15:14.434 "uuid": "3c5007a8-bec6-4ff2-888f-e0893fa80fac", 00:15:14.434 "assigned_rate_limits": { 00:15:14.434 "rw_ios_per_sec": 0, 00:15:14.434 "rw_mbytes_per_sec": 0, 00:15:14.434 "r_mbytes_per_sec": 0, 00:15:14.434 "w_mbytes_per_sec": 0 00:15:14.434 }, 00:15:14.434 "claimed": true, 00:15:14.434 "claim_type": "exclusive_write", 00:15:14.434 "zoned": false, 00:15:14.434 "supported_io_types": { 00:15:14.434 "read": true, 00:15:14.434 "write": true, 00:15:14.434 "unmap": true, 00:15:14.434 "flush": true, 00:15:14.434 "reset": true, 00:15:14.434 "nvme_admin": false, 00:15:14.434 "nvme_io": false, 00:15:14.434 "nvme_io_md": false, 00:15:14.434 "write_zeroes": true, 00:15:14.434 "zcopy": true, 00:15:14.434 "get_zone_info": false, 00:15:14.434 "zone_management": false, 00:15:14.434 "zone_append": false, 00:15:14.434 "compare": false, 00:15:14.434 "compare_and_write": false, 00:15:14.434 "abort": true, 00:15:14.434 "seek_hole": false, 00:15:14.435 "seek_data": false, 00:15:14.435 "copy": true, 00:15:14.435 "nvme_iov_md": false 00:15:14.435 }, 00:15:14.435 "memory_domains": [ 00:15:14.435 { 00:15:14.435 "dma_device_id": "system", 00:15:14.435 "dma_device_type": 1 00:15:14.435 }, 00:15:14.435 { 00:15:14.435 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.435 "dma_device_type": 2 00:15:14.435 } 00:15:14.435 ], 00:15:14.435 "driver_specific": {} 00:15:14.435 }' 00:15:14.435 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.435 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.435 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.435 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.694 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.694 17:09:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:14.694 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.694 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.694 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:14.694 17:09:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.694 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.694 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:14.694 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.694 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:14.694 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.953 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.953 "name": "BaseBdev2", 00:15:14.953 "aliases": [ 00:15:14.953 "529e6b7f-944c-4726-a9db-10bde7ad3d70" 00:15:14.953 ], 00:15:14.953 "product_name": "Malloc disk", 00:15:14.953 "block_size": 512, 00:15:14.953 "num_blocks": 65536, 00:15:14.953 "uuid": "529e6b7f-944c-4726-a9db-10bde7ad3d70", 00:15:14.953 "assigned_rate_limits": { 00:15:14.953 "rw_ios_per_sec": 0, 00:15:14.953 "rw_mbytes_per_sec": 0, 00:15:14.953 "r_mbytes_per_sec": 0, 00:15:14.953 "w_mbytes_per_sec": 0 00:15:14.953 }, 00:15:14.953 "claimed": true, 00:15:14.953 "claim_type": "exclusive_write", 00:15:14.953 "zoned": false, 00:15:14.953 "supported_io_types": { 00:15:14.953 "read": true, 00:15:14.953 "write": true, 00:15:14.953 "unmap": true, 00:15:14.953 "flush": true, 00:15:14.953 "reset": 
true, 00:15:14.953 "nvme_admin": false, 00:15:14.953 "nvme_io": false, 00:15:14.953 "nvme_io_md": false, 00:15:14.953 "write_zeroes": true, 00:15:14.953 "zcopy": true, 00:15:14.953 "get_zone_info": false, 00:15:14.953 "zone_management": false, 00:15:14.953 "zone_append": false, 00:15:14.953 "compare": false, 00:15:14.953 "compare_and_write": false, 00:15:14.953 "abort": true, 00:15:14.953 "seek_hole": false, 00:15:14.953 "seek_data": false, 00:15:14.953 "copy": true, 00:15:14.953 "nvme_iov_md": false 00:15:14.953 }, 00:15:14.953 "memory_domains": [ 00:15:14.953 { 00:15:14.953 "dma_device_id": "system", 00:15:14.953 "dma_device_type": 1 00:15:14.953 }, 00:15:14.953 { 00:15:14.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.953 "dma_device_type": 2 00:15:14.953 } 00:15:14.953 ], 00:15:14.953 "driver_specific": {} 00:15:14.953 }' 00:15:14.953 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.953 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.953 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.953 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.212 17:09:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.212 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:15.471 [2024-07-23 17:09:10.847267] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:15.471 [2024-07-23 17:09:10.847299] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:15.471 [2024-07-23 17:09:10.847344] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.471 17:09:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.730 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.730 "name": "Existed_Raid", 00:15:15.730 "uuid": "c4d55b64-947b-4c15-a978-7761b1020da9", 00:15:15.730 "strip_size_kb": 64, 00:15:15.730 "state": "offline", 00:15:15.730 "raid_level": "raid0", 00:15:15.730 "superblock": false, 00:15:15.730 "num_base_bdevs": 2, 00:15:15.730 "num_base_bdevs_discovered": 1, 00:15:15.730 "num_base_bdevs_operational": 1, 00:15:15.730 "base_bdevs_list": [ 00:15:15.730 { 00:15:15.730 "name": null, 00:15:15.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.730 "is_configured": false, 00:15:15.730 "data_offset": 0, 00:15:15.730 "data_size": 65536 00:15:15.730 }, 00:15:15.730 { 00:15:15.730 "name": "BaseBdev2", 00:15:15.730 "uuid": "529e6b7f-944c-4726-a9db-10bde7ad3d70", 00:15:15.730 "is_configured": true, 00:15:15.730 "data_offset": 0, 00:15:15.730 "data_size": 65536 00:15:15.730 } 00:15:15.730 ] 00:15:15.730 }' 00:15:15.730 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.730 17:09:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.299 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:16.299 17:09:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:16.558 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.558 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:16.558 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:16.558 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:16.558 17:09:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:16.817 [2024-07-23 17:09:12.120564] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:16.817 [2024-07-23 17:09:12.120615] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26cc990 name Existed_Raid, state offline 00:15:16.817 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:16.817 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:16.817 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.817 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4106790 
00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4106790 ']' 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4106790 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4106790 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4106790' 00:15:17.077 killing process with pid 4106790 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4106790 00:15:17.077 [2024-07-23 17:09:12.382337] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:17.077 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4106790 00:15:17.077 [2024-07-23 17:09:12.383215] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:17.337 00:15:17.337 real 0m10.802s 00:15:17.337 user 0m19.268s 00:15:17.337 sys 0m1.986s 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.337 ************************************ 00:15:17.337 END TEST raid_state_function_test 00:15:17.337 ************************************ 00:15:17.337 17:09:12 bdev_raid 
-- common/autotest_common.sh@1142 -- # return 0 00:15:17.337 17:09:12 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:15:17.337 17:09:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:17.337 17:09:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:17.337 17:09:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:17.337 ************************************ 00:15:17.337 START TEST raid_state_function_test_sb 00:15:17.337 ************************************ 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4108847 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4108847' 00:15:17.337 Process raid pid: 4108847 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4108847 /var/tmp/spdk-raid.sock 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 4108847 ']' 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:17.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:17.337 17:09:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.337 [2024-07-23 17:09:12.746762] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:15:17.337 [2024-07-23 17:09:12.746824] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:17.597 [2024-07-23 17:09:12.879409] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:17.597 [2024-07-23 17:09:12.929813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:17.597 [2024-07-23 17:09:12.997388] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:17.597 [2024-07-23 17:09:12.997418] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:18.533 [2024-07-23 17:09:13.904056] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:18.533 [2024-07-23 17:09:13.904098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:18.533 [2024-07-23 17:09:13.904109] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:18.533 [2024-07-23 17:09:13.904120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.533 17:09:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.792 17:09:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.792 "name": "Existed_Raid", 00:15:18.792 "uuid": "0bdb0715-1df5-43da-8926-cdae3565a122", 00:15:18.792 "strip_size_kb": 64, 00:15:18.792 "state": "configuring", 00:15:18.792 "raid_level": "raid0", 00:15:18.792 "superblock": true, 00:15:18.792 "num_base_bdevs": 2, 00:15:18.792 "num_base_bdevs_discovered": 0, 00:15:18.792 "num_base_bdevs_operational": 2, 00:15:18.792 "base_bdevs_list": [ 00:15:18.792 { 00:15:18.792 "name": "BaseBdev1", 00:15:18.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.792 "is_configured": false, 00:15:18.792 "data_offset": 0, 00:15:18.792 "data_size": 0 00:15:18.792 }, 00:15:18.792 { 00:15:18.792 "name": "BaseBdev2", 00:15:18.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.792 "is_configured": false, 00:15:18.792 "data_offset": 0, 00:15:18.792 "data_size": 0 00:15:18.792 } 00:15:18.792 ] 00:15:18.792 }' 00:15:18.792 17:09:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.792 17:09:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.361 17:09:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:19.620 [2024-07-23 17:09:14.994783] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:19.620 [2024-07-23 17:09:14.994813] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x840410 name Existed_Raid, state configuring 00:15:19.620 17:09:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:19.879 [2024-07-23 17:09:15.243456] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:19.879 [2024-07-23 17:09:15.243483] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:19.879 [2024-07-23 17:09:15.243493] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:19.879 [2024-07-23 17:09:15.243504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:19.879 17:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:20.138 [2024-07-23 17:09:15.501977] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:20.138 BaseBdev1 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.138 17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.397 
17:09:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:20.656 [ 00:15:20.656 { 00:15:20.656 "name": "BaseBdev1", 00:15:20.656 "aliases": [ 00:15:20.656 "022eb545-8487-4dda-9f0f-4ac0b3477e5d" 00:15:20.656 ], 00:15:20.656 "product_name": "Malloc disk", 00:15:20.656 "block_size": 512, 00:15:20.656 "num_blocks": 65536, 00:15:20.656 "uuid": "022eb545-8487-4dda-9f0f-4ac0b3477e5d", 00:15:20.656 "assigned_rate_limits": { 00:15:20.656 "rw_ios_per_sec": 0, 00:15:20.656 "rw_mbytes_per_sec": 0, 00:15:20.656 "r_mbytes_per_sec": 0, 00:15:20.656 "w_mbytes_per_sec": 0 00:15:20.656 }, 00:15:20.656 "claimed": true, 00:15:20.656 "claim_type": "exclusive_write", 00:15:20.656 "zoned": false, 00:15:20.656 "supported_io_types": { 00:15:20.656 "read": true, 00:15:20.656 "write": true, 00:15:20.656 "unmap": true, 00:15:20.657 "flush": true, 00:15:20.657 "reset": true, 00:15:20.657 "nvme_admin": false, 00:15:20.657 "nvme_io": false, 00:15:20.657 "nvme_io_md": false, 00:15:20.657 "write_zeroes": true, 00:15:20.657 "zcopy": true, 00:15:20.657 "get_zone_info": false, 00:15:20.657 "zone_management": false, 00:15:20.657 "zone_append": false, 00:15:20.657 "compare": false, 00:15:20.657 "compare_and_write": false, 00:15:20.657 "abort": true, 00:15:20.657 "seek_hole": false, 00:15:20.657 "seek_data": false, 00:15:20.657 "copy": true, 00:15:20.657 "nvme_iov_md": false 00:15:20.657 }, 00:15:20.657 "memory_domains": [ 00:15:20.657 { 00:15:20.657 "dma_device_id": "system", 00:15:20.657 "dma_device_type": 1 00:15:20.657 }, 00:15:20.657 { 00:15:20.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.657 "dma_device_type": 2 00:15:20.657 } 00:15:20.657 ], 00:15:20.657 "driver_specific": {} 00:15:20.657 } 00:15:20.657 ] 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:20.657 
17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.657 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.915 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.915 "name": "Existed_Raid", 00:15:20.915 "uuid": "19f40dc6-7190-44f2-bef1-753add096741", 00:15:20.915 "strip_size_kb": 64, 00:15:20.915 "state": "configuring", 00:15:20.915 "raid_level": "raid0", 00:15:20.915 "superblock": true, 00:15:20.915 "num_base_bdevs": 2, 00:15:20.915 "num_base_bdevs_discovered": 1, 00:15:20.915 "num_base_bdevs_operational": 2, 00:15:20.915 
"base_bdevs_list": [ 00:15:20.915 { 00:15:20.915 "name": "BaseBdev1", 00:15:20.915 "uuid": "022eb545-8487-4dda-9f0f-4ac0b3477e5d", 00:15:20.915 "is_configured": true, 00:15:20.915 "data_offset": 2048, 00:15:20.915 "data_size": 63488 00:15:20.915 }, 00:15:20.915 { 00:15:20.915 "name": "BaseBdev2", 00:15:20.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:20.915 "is_configured": false, 00:15:20.915 "data_offset": 0, 00:15:20.915 "data_size": 0 00:15:20.915 } 00:15:20.915 ] 00:15:20.915 }' 00:15:20.915 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.915 17:09:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.850 17:09:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:21.850 [2024-07-23 17:09:17.134297] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:21.850 [2024-07-23 17:09:17.134333] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x83fd40 name Existed_Raid, state configuring 00:15:21.850 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:22.108 [2024-07-23 17:09:17.382997] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:22.108 [2024-07-23 17:09:17.384431] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:22.108 [2024-07-23 17:09:17.384463] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.108 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.367 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.367 "name": "Existed_Raid", 00:15:22.367 "uuid": "f57b80fb-3b81-4586-8b37-47417a6886d6", 00:15:22.367 "strip_size_kb": 64, 00:15:22.367 "state": "configuring", 00:15:22.367 "raid_level": "raid0", 00:15:22.367 "superblock": true, 00:15:22.367 "num_base_bdevs": 2, 00:15:22.367 
"num_base_bdevs_discovered": 1, 00:15:22.367 "num_base_bdevs_operational": 2, 00:15:22.367 "base_bdevs_list": [ 00:15:22.367 { 00:15:22.367 "name": "BaseBdev1", 00:15:22.367 "uuid": "022eb545-8487-4dda-9f0f-4ac0b3477e5d", 00:15:22.367 "is_configured": true, 00:15:22.367 "data_offset": 2048, 00:15:22.367 "data_size": 63488 00:15:22.367 }, 00:15:22.367 { 00:15:22.367 "name": "BaseBdev2", 00:15:22.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.367 "is_configured": false, 00:15:22.367 "data_offset": 0, 00:15:22.367 "data_size": 0 00:15:22.367 } 00:15:22.367 ] 00:15:22.367 }' 00:15:22.367 17:09:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.367 17:09:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.302 17:09:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:23.302 [2024-07-23 17:09:18.705909] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:23.302 [2024-07-23 17:09:18.706051] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x83f990 00:15:23.302 [2024-07-23 17:09:18.706069] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:23.302 [2024-07-23 17:09:18.706237] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x843a60 00:15:23.302 [2024-07-23 17:09:18.706351] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x83f990 00:15:23.302 [2024-07-23 17:09:18.706361] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x83f990 00:15:23.302 [2024-07-23 17:09:18.706450] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:23.302 BaseBdev2 00:15:23.560 17:09:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:23.561 17:09:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:23.819 [ 00:15:23.819 { 00:15:23.819 "name": "BaseBdev2", 00:15:23.819 "aliases": [ 00:15:23.819 "4a8776fe-9a08-4054-a221-7c1a091b4120" 00:15:23.819 ], 00:15:23.819 "product_name": "Malloc disk", 00:15:23.819 "block_size": 512, 00:15:23.819 "num_blocks": 65536, 00:15:23.819 "uuid": "4a8776fe-9a08-4054-a221-7c1a091b4120", 00:15:23.819 "assigned_rate_limits": { 00:15:23.819 "rw_ios_per_sec": 0, 00:15:23.819 "rw_mbytes_per_sec": 0, 00:15:23.819 "r_mbytes_per_sec": 0, 00:15:23.819 "w_mbytes_per_sec": 0 00:15:23.819 }, 00:15:23.819 "claimed": true, 00:15:23.819 "claim_type": "exclusive_write", 00:15:23.819 "zoned": false, 00:15:23.819 "supported_io_types": { 00:15:23.819 "read": true, 00:15:23.819 "write": true, 00:15:23.819 "unmap": true, 00:15:23.819 "flush": true, 00:15:23.819 "reset": true, 00:15:23.819 "nvme_admin": false, 00:15:23.819 "nvme_io": false, 00:15:23.819 "nvme_io_md": false, 00:15:23.819 "write_zeroes": true, 
00:15:23.819 "zcopy": true, 00:15:23.819 "get_zone_info": false, 00:15:23.819 "zone_management": false, 00:15:23.819 "zone_append": false, 00:15:23.819 "compare": false, 00:15:23.819 "compare_and_write": false, 00:15:23.819 "abort": true, 00:15:23.819 "seek_hole": false, 00:15:23.819 "seek_data": false, 00:15:23.819 "copy": true, 00:15:23.819 "nvme_iov_md": false 00:15:23.819 }, 00:15:23.819 "memory_domains": [ 00:15:23.819 { 00:15:23.819 "dma_device_id": "system", 00:15:23.819 "dma_device_type": 1 00:15:23.819 }, 00:15:23.819 { 00:15:23.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.819 "dma_device_type": 2 00:15:23.819 } 00:15:23.819 ], 00:15:23.819 "driver_specific": {} 00:15:23.819 } 00:15:23.819 ] 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.819 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.078 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.078 "name": "Existed_Raid", 00:15:24.078 "uuid": "f57b80fb-3b81-4586-8b37-47417a6886d6", 00:15:24.078 "strip_size_kb": 64, 00:15:24.078 "state": "online", 00:15:24.078 "raid_level": "raid0", 00:15:24.078 "superblock": true, 00:15:24.078 "num_base_bdevs": 2, 00:15:24.078 "num_base_bdevs_discovered": 2, 00:15:24.078 "num_base_bdevs_operational": 2, 00:15:24.078 "base_bdevs_list": [ 00:15:24.078 { 00:15:24.078 "name": "BaseBdev1", 00:15:24.078 "uuid": "022eb545-8487-4dda-9f0f-4ac0b3477e5d", 00:15:24.078 "is_configured": true, 00:15:24.078 "data_offset": 2048, 00:15:24.078 "data_size": 63488 00:15:24.078 }, 00:15:24.078 { 00:15:24.078 "name": "BaseBdev2", 00:15:24.078 "uuid": "4a8776fe-9a08-4054-a221-7c1a091b4120", 00:15:24.078 "is_configured": true, 00:15:24.078 "data_offset": 2048, 00:15:24.078 "data_size": 63488 00:15:24.078 } 00:15:24.078 ] 00:15:24.078 }' 00:15:24.078 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.078 17:09:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:24.645 17:09:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:24.904 [2024-07-23 17:09:20.150014] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:24.904 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:24.904 "name": "Existed_Raid", 00:15:24.904 "aliases": [ 00:15:24.904 "f57b80fb-3b81-4586-8b37-47417a6886d6" 00:15:24.904 ], 00:15:24.904 "product_name": "Raid Volume", 00:15:24.904 "block_size": 512, 00:15:24.904 "num_blocks": 126976, 00:15:24.904 "uuid": "f57b80fb-3b81-4586-8b37-47417a6886d6", 00:15:24.904 "assigned_rate_limits": { 00:15:24.904 "rw_ios_per_sec": 0, 00:15:24.904 "rw_mbytes_per_sec": 0, 00:15:24.904 "r_mbytes_per_sec": 0, 00:15:24.904 "w_mbytes_per_sec": 0 00:15:24.904 }, 00:15:24.904 "claimed": false, 00:15:24.904 "zoned": false, 00:15:24.904 "supported_io_types": { 00:15:24.904 "read": true, 00:15:24.904 "write": true, 00:15:24.904 "unmap": true, 00:15:24.904 "flush": true, 00:15:24.904 "reset": true, 00:15:24.904 "nvme_admin": false, 00:15:24.904 "nvme_io": false, 00:15:24.904 "nvme_io_md": false, 00:15:24.904 "write_zeroes": true, 00:15:24.904 "zcopy": false, 00:15:24.904 "get_zone_info": false, 00:15:24.904 "zone_management": false, 00:15:24.904 
"zone_append": false, 00:15:24.904 "compare": false, 00:15:24.904 "compare_and_write": false, 00:15:24.904 "abort": false, 00:15:24.904 "seek_hole": false, 00:15:24.904 "seek_data": false, 00:15:24.904 "copy": false, 00:15:24.904 "nvme_iov_md": false 00:15:24.904 }, 00:15:24.904 "memory_domains": [ 00:15:24.904 { 00:15:24.904 "dma_device_id": "system", 00:15:24.904 "dma_device_type": 1 00:15:24.904 }, 00:15:24.904 { 00:15:24.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.904 "dma_device_type": 2 00:15:24.904 }, 00:15:24.904 { 00:15:24.904 "dma_device_id": "system", 00:15:24.904 "dma_device_type": 1 00:15:24.904 }, 00:15:24.904 { 00:15:24.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.904 "dma_device_type": 2 00:15:24.904 } 00:15:24.904 ], 00:15:24.904 "driver_specific": { 00:15:24.904 "raid": { 00:15:24.904 "uuid": "f57b80fb-3b81-4586-8b37-47417a6886d6", 00:15:24.904 "strip_size_kb": 64, 00:15:24.904 "state": "online", 00:15:24.904 "raid_level": "raid0", 00:15:24.904 "superblock": true, 00:15:24.904 "num_base_bdevs": 2, 00:15:24.904 "num_base_bdevs_discovered": 2, 00:15:24.904 "num_base_bdevs_operational": 2, 00:15:24.904 "base_bdevs_list": [ 00:15:24.904 { 00:15:24.904 "name": "BaseBdev1", 00:15:24.904 "uuid": "022eb545-8487-4dda-9f0f-4ac0b3477e5d", 00:15:24.904 "is_configured": true, 00:15:24.904 "data_offset": 2048, 00:15:24.904 "data_size": 63488 00:15:24.904 }, 00:15:24.904 { 00:15:24.904 "name": "BaseBdev2", 00:15:24.904 "uuid": "4a8776fe-9a08-4054-a221-7c1a091b4120", 00:15:24.904 "is_configured": true, 00:15:24.904 "data_offset": 2048, 00:15:24.904 "data_size": 63488 00:15:24.904 } 00:15:24.904 ] 00:15:24.904 } 00:15:24.904 } 00:15:24.904 }' 00:15:24.904 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:24.904 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:24.904 
BaseBdev2' 00:15:24.904 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:24.904 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:24.904 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.163 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.163 "name": "BaseBdev1", 00:15:25.163 "aliases": [ 00:15:25.163 "022eb545-8487-4dda-9f0f-4ac0b3477e5d" 00:15:25.163 ], 00:15:25.163 "product_name": "Malloc disk", 00:15:25.163 "block_size": 512, 00:15:25.163 "num_blocks": 65536, 00:15:25.163 "uuid": "022eb545-8487-4dda-9f0f-4ac0b3477e5d", 00:15:25.163 "assigned_rate_limits": { 00:15:25.163 "rw_ios_per_sec": 0, 00:15:25.163 "rw_mbytes_per_sec": 0, 00:15:25.163 "r_mbytes_per_sec": 0, 00:15:25.163 "w_mbytes_per_sec": 0 00:15:25.163 }, 00:15:25.163 "claimed": true, 00:15:25.163 "claim_type": "exclusive_write", 00:15:25.163 "zoned": false, 00:15:25.163 "supported_io_types": { 00:15:25.163 "read": true, 00:15:25.163 "write": true, 00:15:25.163 "unmap": true, 00:15:25.163 "flush": true, 00:15:25.163 "reset": true, 00:15:25.163 "nvme_admin": false, 00:15:25.163 "nvme_io": false, 00:15:25.163 "nvme_io_md": false, 00:15:25.163 "write_zeroes": true, 00:15:25.163 "zcopy": true, 00:15:25.163 "get_zone_info": false, 00:15:25.163 "zone_management": false, 00:15:25.163 "zone_append": false, 00:15:25.163 "compare": false, 00:15:25.163 "compare_and_write": false, 00:15:25.163 "abort": true, 00:15:25.163 "seek_hole": false, 00:15:25.163 "seek_data": false, 00:15:25.163 "copy": true, 00:15:25.163 "nvme_iov_md": false 00:15:25.163 }, 00:15:25.163 "memory_domains": [ 00:15:25.163 { 00:15:25.163 "dma_device_id": "system", 00:15:25.163 "dma_device_type": 1 00:15:25.163 }, 00:15:25.163 { 
00:15:25.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.163 "dma_device_type": 2 00:15:25.163 } 00:15:25.163 ], 00:15:25.163 "driver_specific": {} 00:15:25.163 }' 00:15:25.163 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.163 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.163 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.163 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.421 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.421 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.421 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.421 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.421 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.422 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.422 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.680 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.680 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.680 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:25.680 17:09:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.680 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.680 "name": 
"BaseBdev2", 00:15:25.680 "aliases": [ 00:15:25.680 "4a8776fe-9a08-4054-a221-7c1a091b4120" 00:15:25.680 ], 00:15:25.680 "product_name": "Malloc disk", 00:15:25.680 "block_size": 512, 00:15:25.680 "num_blocks": 65536, 00:15:25.680 "uuid": "4a8776fe-9a08-4054-a221-7c1a091b4120", 00:15:25.680 "assigned_rate_limits": { 00:15:25.680 "rw_ios_per_sec": 0, 00:15:25.680 "rw_mbytes_per_sec": 0, 00:15:25.680 "r_mbytes_per_sec": 0, 00:15:25.680 "w_mbytes_per_sec": 0 00:15:25.680 }, 00:15:25.680 "claimed": true, 00:15:25.680 "claim_type": "exclusive_write", 00:15:25.680 "zoned": false, 00:15:25.680 "supported_io_types": { 00:15:25.680 "read": true, 00:15:25.680 "write": true, 00:15:25.680 "unmap": true, 00:15:25.680 "flush": true, 00:15:25.680 "reset": true, 00:15:25.680 "nvme_admin": false, 00:15:25.680 "nvme_io": false, 00:15:25.680 "nvme_io_md": false, 00:15:25.680 "write_zeroes": true, 00:15:25.680 "zcopy": true, 00:15:25.680 "get_zone_info": false, 00:15:25.680 "zone_management": false, 00:15:25.680 "zone_append": false, 00:15:25.680 "compare": false, 00:15:25.680 "compare_and_write": false, 00:15:25.680 "abort": true, 00:15:25.680 "seek_hole": false, 00:15:25.680 "seek_data": false, 00:15:25.680 "copy": true, 00:15:25.680 "nvme_iov_md": false 00:15:25.680 }, 00:15:25.680 "memory_domains": [ 00:15:25.680 { 00:15:25.680 "dma_device_id": "system", 00:15:25.680 "dma_device_type": 1 00:15:25.680 }, 00:15:25.680 { 00:15:25.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.680 "dma_device_type": 2 00:15:25.680 } 00:15:25.680 ], 00:15:25.680 "driver_specific": {} 00:15:25.680 }' 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.939 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.197 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.197 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.197 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.197 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.197 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:26.456 [2024-07-23 17:09:21.689857] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:26.456 [2024-07-23 17:09:21.689881] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:26.456 [2024-07-23 17:09:21.689927] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:26.456 17:09:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.456 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:26.715 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.715 "name": "Existed_Raid", 00:15:26.715 "uuid": "f57b80fb-3b81-4586-8b37-47417a6886d6", 00:15:26.715 "strip_size_kb": 64, 00:15:26.715 "state": "offline", 00:15:26.715 "raid_level": "raid0", 00:15:26.715 "superblock": true, 00:15:26.715 "num_base_bdevs": 2, 00:15:26.715 "num_base_bdevs_discovered": 1, 00:15:26.715 "num_base_bdevs_operational": 1, 00:15:26.715 "base_bdevs_list": [ 
00:15:26.715 { 00:15:26.715 "name": null, 00:15:26.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:26.715 "is_configured": false, 00:15:26.715 "data_offset": 2048, 00:15:26.715 "data_size": 63488 00:15:26.715 }, 00:15:26.715 { 00:15:26.715 "name": "BaseBdev2", 00:15:26.715 "uuid": "4a8776fe-9a08-4054-a221-7c1a091b4120", 00:15:26.715 "is_configured": true, 00:15:26.715 "data_offset": 2048, 00:15:26.715 "data_size": 63488 00:15:26.715 } 00:15:26.715 ] 00:15:26.715 }' 00:15:26.715 17:09:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.715 17:09:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.352 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:27.352 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:27.352 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.352 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:27.611 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:27.611 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:27.611 17:09:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:27.870 [2024-07-23 17:09:23.062526] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:27.870 [2024-07-23 17:09:23.062572] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x83f990 name Existed_Raid, state offline 00:15:27.870 17:09:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:27.870 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:27.870 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.870 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4108847 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4108847 ']' 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4108847 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4108847 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4108847' 00:15:28.129 killing process with pid 4108847 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 4108847 00:15:28.129 [2024-07-23 17:09:23.468055] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:28.129 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4108847 00:15:28.129 [2024-07-23 17:09:23.469032] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:28.388 17:09:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:28.388 00:15:28.388 real 0m10.998s 00:15:28.388 user 0m19.620s 00:15:28.388 sys 0m2.034s 00:15:28.388 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:28.388 17:09:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.388 ************************************ 00:15:28.388 END TEST raid_state_function_test_sb 00:15:28.388 ************************************ 00:15:28.388 17:09:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:28.388 17:09:23 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:15:28.388 17:09:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:28.388 17:09:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:28.388 17:09:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:28.388 ************************************ 00:15:28.388 START TEST raid_superblock_test 00:15:28.388 ************************************ 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:28.388 17:09:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4110482 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4110482 /var/tmp/spdk-raid.sock 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:28.388 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4110482 ']' 00:15:28.389 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:15:28.389 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:28.389 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:28.389 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:28.389 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:28.389 17:09:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.648 [2024-07-23 17:09:23.824924] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:15:28.648 [2024-07-23 17:09:23.824991] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4110482 ] 00:15:28.648 [2024-07-23 17:09:23.958357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:28.648 [2024-07-23 17:09:24.014649] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:28.908 [2024-07-23 17:09:24.076686] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.908 [2024-07-23 17:09:24.076719] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:29.476 17:09:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:29.735 malloc1 00:15:29.735 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:29.995 [2024-07-23 17:09:25.246919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:29.995 [2024-07-23 17:09:25.246970] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:29.995 [2024-07-23 17:09:25.246994] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28e5070 00:15:29.995 [2024-07-23 17:09:25.247006] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:29.995 [2024-07-23 17:09:25.248665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:29.995 [2024-07-23 17:09:25.248693] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:29.995 pt1 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:29.995 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:30.254 malloc2 00:15:30.254 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:30.514 [2024-07-23 17:09:25.748949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:30.514 [2024-07-23 17:09:25.748996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.514 [2024-07-23 17:09:25.749013] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27cb920 00:15:30.514 [2024-07-23 17:09:25.749026] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.514 [2024-07-23 17:09:25.750748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.514 [2024-07-23 17:09:25.750776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:30.514 pt2 00:15:30.514 17:09:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:30.514 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:30.514 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:15:30.773 [2024-07-23 17:09:25.981586] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:30.773 [2024-07-23 17:09:25.982911] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:30.773 [2024-07-23 17:09:25.983060] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28dd3e0 00:15:30.773 [2024-07-23 17:09:25.983073] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:30.773 [2024-07-23 17:09:25.983271] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28de280 00:15:30.773 [2024-07-23 17:09:25.983414] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28dd3e0 00:15:30.773 [2024-07-23 17:09:25.983424] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28dd3e0 00:15:30.773 [2024-07-23 17:09:25.983524] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.773 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:15:30.773 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:30.773 17:09:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:30.773 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.032 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.032 "name": "raid_bdev1", 00:15:31.032 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:31.032 "strip_size_kb": 64, 00:15:31.032 "state": "online", 00:15:31.032 "raid_level": "raid0", 00:15:31.032 "superblock": true, 00:15:31.032 "num_base_bdevs": 2, 00:15:31.032 "num_base_bdevs_discovered": 2, 00:15:31.032 "num_base_bdevs_operational": 2, 00:15:31.032 "base_bdevs_list": [ 00:15:31.032 { 00:15:31.032 "name": "pt1", 00:15:31.032 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:31.032 "is_configured": true, 00:15:31.032 "data_offset": 2048, 00:15:31.032 "data_size": 63488 00:15:31.032 }, 00:15:31.032 { 00:15:31.032 "name": "pt2", 00:15:31.032 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.032 "is_configured": true, 00:15:31.032 "data_offset": 2048, 00:15:31.032 "data_size": 63488 00:15:31.032 } 00:15:31.032 ] 00:15:31.032 }' 00:15:31.032 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.032 17:09:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.601 17:09:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:31.601 17:09:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:31.860 [2024-07-23 17:09:27.052622] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:31.860 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:31.860 "name": "raid_bdev1", 00:15:31.861 "aliases": [ 00:15:31.861 "b6859c9c-d983-44e5-bf02-94e0f3b35982" 00:15:31.861 ], 00:15:31.861 "product_name": "Raid Volume", 00:15:31.861 "block_size": 512, 00:15:31.861 "num_blocks": 126976, 00:15:31.861 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:31.861 "assigned_rate_limits": { 00:15:31.861 "rw_ios_per_sec": 0, 00:15:31.861 "rw_mbytes_per_sec": 0, 00:15:31.861 "r_mbytes_per_sec": 0, 00:15:31.861 "w_mbytes_per_sec": 0 00:15:31.861 }, 00:15:31.861 "claimed": false, 00:15:31.861 "zoned": false, 00:15:31.861 "supported_io_types": { 00:15:31.861 "read": true, 00:15:31.861 "write": true, 00:15:31.861 "unmap": true, 00:15:31.861 "flush": true, 00:15:31.861 "reset": true, 00:15:31.861 "nvme_admin": false, 00:15:31.861 "nvme_io": false, 00:15:31.861 "nvme_io_md": false, 00:15:31.861 "write_zeroes": 
true, 00:15:31.861 "zcopy": false, 00:15:31.861 "get_zone_info": false, 00:15:31.861 "zone_management": false, 00:15:31.861 "zone_append": false, 00:15:31.861 "compare": false, 00:15:31.861 "compare_and_write": false, 00:15:31.861 "abort": false, 00:15:31.861 "seek_hole": false, 00:15:31.861 "seek_data": false, 00:15:31.861 "copy": false, 00:15:31.861 "nvme_iov_md": false 00:15:31.861 }, 00:15:31.861 "memory_domains": [ 00:15:31.861 { 00:15:31.861 "dma_device_id": "system", 00:15:31.861 "dma_device_type": 1 00:15:31.861 }, 00:15:31.861 { 00:15:31.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.861 "dma_device_type": 2 00:15:31.861 }, 00:15:31.861 { 00:15:31.861 "dma_device_id": "system", 00:15:31.861 "dma_device_type": 1 00:15:31.861 }, 00:15:31.861 { 00:15:31.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.861 "dma_device_type": 2 00:15:31.861 } 00:15:31.861 ], 00:15:31.861 "driver_specific": { 00:15:31.861 "raid": { 00:15:31.861 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:31.861 "strip_size_kb": 64, 00:15:31.861 "state": "online", 00:15:31.861 "raid_level": "raid0", 00:15:31.861 "superblock": true, 00:15:31.861 "num_base_bdevs": 2, 00:15:31.861 "num_base_bdevs_discovered": 2, 00:15:31.861 "num_base_bdevs_operational": 2, 00:15:31.861 "base_bdevs_list": [ 00:15:31.861 { 00:15:31.861 "name": "pt1", 00:15:31.861 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:31.861 "is_configured": true, 00:15:31.861 "data_offset": 2048, 00:15:31.861 "data_size": 63488 00:15:31.861 }, 00:15:31.861 { 00:15:31.861 "name": "pt2", 00:15:31.861 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:31.861 "is_configured": true, 00:15:31.861 "data_offset": 2048, 00:15:31.861 "data_size": 63488 00:15:31.861 } 00:15:31.861 ] 00:15:31.861 } 00:15:31.861 } 00:15:31.861 }' 00:15:31.861 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:31.861 17:09:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:31.861 pt2' 00:15:31.861 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.861 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:31.861 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.120 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.120 "name": "pt1", 00:15:32.120 "aliases": [ 00:15:32.120 "00000000-0000-0000-0000-000000000001" 00:15:32.120 ], 00:15:32.120 "product_name": "passthru", 00:15:32.120 "block_size": 512, 00:15:32.120 "num_blocks": 65536, 00:15:32.120 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:32.120 "assigned_rate_limits": { 00:15:32.120 "rw_ios_per_sec": 0, 00:15:32.120 "rw_mbytes_per_sec": 0, 00:15:32.120 "r_mbytes_per_sec": 0, 00:15:32.120 "w_mbytes_per_sec": 0 00:15:32.120 }, 00:15:32.120 "claimed": true, 00:15:32.120 "claim_type": "exclusive_write", 00:15:32.120 "zoned": false, 00:15:32.120 "supported_io_types": { 00:15:32.120 "read": true, 00:15:32.121 "write": true, 00:15:32.121 "unmap": true, 00:15:32.121 "flush": true, 00:15:32.121 "reset": true, 00:15:32.121 "nvme_admin": false, 00:15:32.121 "nvme_io": false, 00:15:32.121 "nvme_io_md": false, 00:15:32.121 "write_zeroes": true, 00:15:32.121 "zcopy": true, 00:15:32.121 "get_zone_info": false, 00:15:32.121 "zone_management": false, 00:15:32.121 "zone_append": false, 00:15:32.121 "compare": false, 00:15:32.121 "compare_and_write": false, 00:15:32.121 "abort": true, 00:15:32.121 "seek_hole": false, 00:15:32.121 "seek_data": false, 00:15:32.121 "copy": true, 00:15:32.121 "nvme_iov_md": false 00:15:32.121 }, 00:15:32.121 "memory_domains": [ 00:15:32.121 { 00:15:32.121 "dma_device_id": "system", 00:15:32.121 
"dma_device_type": 1 00:15:32.121 }, 00:15:32.121 { 00:15:32.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.121 "dma_device_type": 2 00:15:32.121 } 00:15:32.121 ], 00:15:32.121 "driver_specific": { 00:15:32.121 "passthru": { 00:15:32.121 "name": "pt1", 00:15:32.121 "base_bdev_name": "malloc1" 00:15:32.121 } 00:15:32.121 } 00:15:32.121 }' 00:15:32.121 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.121 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.121 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.121 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.121 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:32.380 17:09:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.639 17:09:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.639 "name": "pt2", 00:15:32.639 "aliases": [ 00:15:32.639 "00000000-0000-0000-0000-000000000002" 00:15:32.639 ], 00:15:32.639 "product_name": "passthru", 00:15:32.639 "block_size": 512, 00:15:32.639 "num_blocks": 65536, 00:15:32.639 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:32.639 "assigned_rate_limits": { 00:15:32.639 "rw_ios_per_sec": 0, 00:15:32.639 "rw_mbytes_per_sec": 0, 00:15:32.639 "r_mbytes_per_sec": 0, 00:15:32.639 "w_mbytes_per_sec": 0 00:15:32.639 }, 00:15:32.639 "claimed": true, 00:15:32.639 "claim_type": "exclusive_write", 00:15:32.639 "zoned": false, 00:15:32.639 "supported_io_types": { 00:15:32.639 "read": true, 00:15:32.639 "write": true, 00:15:32.639 "unmap": true, 00:15:32.639 "flush": true, 00:15:32.639 "reset": true, 00:15:32.639 "nvme_admin": false, 00:15:32.639 "nvme_io": false, 00:15:32.639 "nvme_io_md": false, 00:15:32.639 "write_zeroes": true, 00:15:32.639 "zcopy": true, 00:15:32.639 "get_zone_info": false, 00:15:32.639 "zone_management": false, 00:15:32.639 "zone_append": false, 00:15:32.639 "compare": false, 00:15:32.639 "compare_and_write": false, 00:15:32.639 "abort": true, 00:15:32.639 "seek_hole": false, 00:15:32.639 "seek_data": false, 00:15:32.639 "copy": true, 00:15:32.639 "nvme_iov_md": false 00:15:32.639 }, 00:15:32.639 "memory_domains": [ 00:15:32.639 { 00:15:32.639 "dma_device_id": "system", 00:15:32.639 "dma_device_type": 1 00:15:32.639 }, 00:15:32.639 { 00:15:32.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.639 "dma_device_type": 2 00:15:32.639 } 00:15:32.639 ], 00:15:32.639 "driver_specific": { 00:15:32.639 "passthru": { 00:15:32.639 "name": "pt2", 00:15:32.639 "base_bdev_name": "malloc2" 00:15:32.639 } 00:15:32.639 } 00:15:32.639 }' 00:15:32.639 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.639 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.898 17:09:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.898 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.898 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.898 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.898 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.898 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.157 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.157 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.157 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.157 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.157 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:33.157 17:09:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:33.725 [2024-07-23 17:09:29.013813] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.725 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b6859c9c-d983-44e5-bf02-94e0f3b35982 00:15:33.725 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b6859c9c-d983-44e5-bf02-94e0f3b35982 ']' 00:15:33.725 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:33.984 [2024-07-23 17:09:29.274247] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:33.984 
[2024-07-23 17:09:29.274268] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:33.984 [2024-07-23 17:09:29.274322] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:33.984 [2024-07-23 17:09:29.274365] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:33.984 [2024-07-23 17:09:29.274377] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28dd3e0 name raid_bdev1, state offline 00:15:33.984 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.984 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:34.243 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:34.243 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:34.243 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:34.243 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:34.502 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:34.502 17:09:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:34.761 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:34.761 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:35.020 17:09:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:35.020 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:35.021 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:15:35.280 [2024-07-23 17:09:30.561595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:35.280 [2024-07-23 17:09:30.562965] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:35.280 [2024-07-23 17:09:30.563021] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:35.280 [2024-07-23 17:09:30.563059] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:35.280 [2024-07-23 17:09:30.563078] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:35.280 [2024-07-23 17:09:30.563087] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27cb090 name raid_bdev1, state configuring 00:15:35.280 request: 00:15:35.280 { 00:15:35.280 "name": "raid_bdev1", 00:15:35.280 "raid_level": "raid0", 00:15:35.280 "base_bdevs": [ 00:15:35.280 "malloc1", 00:15:35.280 "malloc2" 00:15:35.280 ], 00:15:35.280 "strip_size_kb": 64, 00:15:35.280 "superblock": false, 00:15:35.280 "method": "bdev_raid_create", 00:15:35.280 "req_id": 1 00:15:35.280 } 00:15:35.280 Got JSON-RPC error response 00:15:35.280 response: 00:15:35.280 { 00:15:35.280 "code": -17, 00:15:35.280 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:35.280 } 00:15:35.280 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:35.280 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:35.280 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:35.280 17:09:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:35.280 17:09:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.280 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:35.539 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:35.539 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:35.539 17:09:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:35.798 [2024-07-23 17:09:31.058853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:35.798 [2024-07-23 17:09:31.058905] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.798 [2024-07-23 17:09:31.058923] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2731da0 00:15:35.798 [2024-07-23 17:09:31.058935] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.798 [2024-07-23 17:09:31.060560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.798 [2024-07-23 17:09:31.060588] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:35.798 [2024-07-23 17:09:31.060658] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:35.798 [2024-07-23 17:09:31.060683] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:35.798 pt1 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:35.798 17:09:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.798 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.367 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.367 "name": "raid_bdev1", 00:15:36.367 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:36.367 "strip_size_kb": 64, 00:15:36.367 "state": "configuring", 00:15:36.367 "raid_level": "raid0", 00:15:36.367 "superblock": true, 00:15:36.367 "num_base_bdevs": 2, 00:15:36.367 "num_base_bdevs_discovered": 1, 00:15:36.367 "num_base_bdevs_operational": 2, 00:15:36.367 "base_bdevs_list": [ 00:15:36.367 { 00:15:36.367 "name": "pt1", 00:15:36.367 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:36.367 "is_configured": true, 00:15:36.367 "data_offset": 2048, 00:15:36.367 "data_size": 63488 00:15:36.367 }, 00:15:36.367 { 00:15:36.367 "name": null, 00:15:36.367 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:36.367 
"is_configured": false, 00:15:36.367 "data_offset": 2048, 00:15:36.367 "data_size": 63488 00:15:36.367 } 00:15:36.367 ] 00:15:36.367 }' 00:15:36.367 17:09:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.367 17:09:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:37.305 [2024-07-23 17:09:32.687168] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:37.305 [2024-07-23 17:09:32.687215] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.305 [2024-07-23 17:09:32.687234] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2734b00 00:15:37.305 [2024-07-23 17:09:32.687246] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.305 [2024-07-23 17:09:32.687580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.305 [2024-07-23 17:09:32.687598] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:37.305 [2024-07-23 17:09:32.687657] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:37.305 [2024-07-23 17:09:32.687677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:37.305 [2024-07-23 17:09:32.687768] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28de5d0 00:15:37.305 [2024-07-23 
17:09:32.687778] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:37.305 [2024-07-23 17:09:32.687952] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28df3f0 00:15:37.305 [2024-07-23 17:09:32.688079] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28de5d0 00:15:37.305 [2024-07-23 17:09:32.688089] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28de5d0 00:15:37.305 [2024-07-23 17:09:32.688190] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:37.305 pt2 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:37.305 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.564 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.564 "name": "raid_bdev1", 00:15:37.564 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:37.564 "strip_size_kb": 64, 00:15:37.564 "state": "online", 00:15:37.564 "raid_level": "raid0", 00:15:37.564 "superblock": true, 00:15:37.564 "num_base_bdevs": 2, 00:15:37.564 "num_base_bdevs_discovered": 2, 00:15:37.564 "num_base_bdevs_operational": 2, 00:15:37.564 "base_bdevs_list": [ 00:15:37.564 { 00:15:37.564 "name": "pt1", 00:15:37.564 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.564 "is_configured": true, 00:15:37.564 "data_offset": 2048, 00:15:37.564 "data_size": 63488 00:15:37.564 }, 00:15:37.564 { 00:15:37.564 "name": "pt2", 00:15:37.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.564 "is_configured": true, 00:15:37.564 "data_offset": 2048, 00:15:37.564 "data_size": 63488 00:15:37.564 } 00:15:37.564 ] 00:15:37.564 }' 00:15:37.564 17:09:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.564 17:09:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:38.508 17:09:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:38.508 [2024-07-23 17:09:33.766263] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:38.508 "name": "raid_bdev1", 00:15:38.508 "aliases": [ 00:15:38.508 "b6859c9c-d983-44e5-bf02-94e0f3b35982" 00:15:38.508 ], 00:15:38.508 "product_name": "Raid Volume", 00:15:38.508 "block_size": 512, 00:15:38.508 "num_blocks": 126976, 00:15:38.508 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:38.508 "assigned_rate_limits": { 00:15:38.508 "rw_ios_per_sec": 0, 00:15:38.508 "rw_mbytes_per_sec": 0, 00:15:38.508 "r_mbytes_per_sec": 0, 00:15:38.508 "w_mbytes_per_sec": 0 00:15:38.508 }, 00:15:38.508 "claimed": false, 00:15:38.508 "zoned": false, 00:15:38.508 "supported_io_types": { 00:15:38.508 "read": true, 00:15:38.508 "write": true, 00:15:38.508 "unmap": true, 00:15:38.508 "flush": true, 00:15:38.508 "reset": true, 00:15:38.508 "nvme_admin": false, 00:15:38.508 "nvme_io": false, 00:15:38.508 "nvme_io_md": false, 00:15:38.508 "write_zeroes": true, 00:15:38.508 "zcopy": false, 00:15:38.508 "get_zone_info": false, 00:15:38.508 "zone_management": false, 00:15:38.508 "zone_append": false, 00:15:38.508 "compare": false, 00:15:38.508 "compare_and_write": false, 00:15:38.508 "abort": false, 00:15:38.508 "seek_hole": false, 00:15:38.508 "seek_data": false, 00:15:38.508 "copy": false, 00:15:38.508 "nvme_iov_md": false 00:15:38.508 }, 00:15:38.508 "memory_domains": [ 00:15:38.508 { 00:15:38.508 "dma_device_id": "system", 00:15:38.508 "dma_device_type": 1 00:15:38.508 }, 00:15:38.508 { 
00:15:38.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.508 "dma_device_type": 2 00:15:38.508 }, 00:15:38.508 { 00:15:38.508 "dma_device_id": "system", 00:15:38.508 "dma_device_type": 1 00:15:38.508 }, 00:15:38.508 { 00:15:38.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.508 "dma_device_type": 2 00:15:38.508 } 00:15:38.508 ], 00:15:38.508 "driver_specific": { 00:15:38.508 "raid": { 00:15:38.508 "uuid": "b6859c9c-d983-44e5-bf02-94e0f3b35982", 00:15:38.508 "strip_size_kb": 64, 00:15:38.508 "state": "online", 00:15:38.508 "raid_level": "raid0", 00:15:38.508 "superblock": true, 00:15:38.508 "num_base_bdevs": 2, 00:15:38.508 "num_base_bdevs_discovered": 2, 00:15:38.508 "num_base_bdevs_operational": 2, 00:15:38.508 "base_bdevs_list": [ 00:15:38.508 { 00:15:38.508 "name": "pt1", 00:15:38.508 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.508 "is_configured": true, 00:15:38.508 "data_offset": 2048, 00:15:38.508 "data_size": 63488 00:15:38.508 }, 00:15:38.508 { 00:15:38.508 "name": "pt2", 00:15:38.508 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.508 "is_configured": true, 00:15:38.508 "data_offset": 2048, 00:15:38.508 "data_size": 63488 00:15:38.508 } 00:15:38.508 ] 00:15:38.508 } 00:15:38.508 } 00:15:38.508 }' 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:38.508 pt2' 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:38.508 17:09:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.767 17:09:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.767 "name": "pt1", 00:15:38.767 "aliases": [ 00:15:38.767 "00000000-0000-0000-0000-000000000001" 00:15:38.767 ], 00:15:38.767 "product_name": "passthru", 00:15:38.767 "block_size": 512, 00:15:38.767 "num_blocks": 65536, 00:15:38.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.767 "assigned_rate_limits": { 00:15:38.767 "rw_ios_per_sec": 0, 00:15:38.767 "rw_mbytes_per_sec": 0, 00:15:38.767 "r_mbytes_per_sec": 0, 00:15:38.767 "w_mbytes_per_sec": 0 00:15:38.767 }, 00:15:38.767 "claimed": true, 00:15:38.767 "claim_type": "exclusive_write", 00:15:38.767 "zoned": false, 00:15:38.767 "supported_io_types": { 00:15:38.767 "read": true, 00:15:38.767 "write": true, 00:15:38.767 "unmap": true, 00:15:38.767 "flush": true, 00:15:38.767 "reset": true, 00:15:38.767 "nvme_admin": false, 00:15:38.767 "nvme_io": false, 00:15:38.767 "nvme_io_md": false, 00:15:38.767 "write_zeroes": true, 00:15:38.767 "zcopy": true, 00:15:38.767 "get_zone_info": false, 00:15:38.767 "zone_management": false, 00:15:38.767 "zone_append": false, 00:15:38.767 "compare": false, 00:15:38.767 "compare_and_write": false, 00:15:38.767 "abort": true, 00:15:38.767 "seek_hole": false, 00:15:38.767 "seek_data": false, 00:15:38.767 "copy": true, 00:15:38.767 "nvme_iov_md": false 00:15:38.767 }, 00:15:38.767 "memory_domains": [ 00:15:38.767 { 00:15:38.767 "dma_device_id": "system", 00:15:38.767 "dma_device_type": 1 00:15:38.767 }, 00:15:38.767 { 00:15:38.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.767 "dma_device_type": 2 00:15:38.767 } 00:15:38.767 ], 00:15:38.767 "driver_specific": { 00:15:38.767 "passthru": { 00:15:38.767 "name": "pt1", 00:15:38.767 "base_bdev_name": "malloc1" 00:15:38.767 } 00:15:38.767 } 00:15:38.767 }' 00:15:38.767 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.767 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:15:38.767 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.767 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:39.026 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.286 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.286 "name": "pt2", 00:15:39.286 "aliases": [ 00:15:39.286 "00000000-0000-0000-0000-000000000002" 00:15:39.286 ], 00:15:39.286 "product_name": "passthru", 00:15:39.286 "block_size": 512, 00:15:39.286 "num_blocks": 65536, 00:15:39.286 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.286 "assigned_rate_limits": { 00:15:39.286 "rw_ios_per_sec": 0, 00:15:39.286 "rw_mbytes_per_sec": 0, 00:15:39.286 "r_mbytes_per_sec": 0, 00:15:39.286 "w_mbytes_per_sec": 0 00:15:39.286 }, 
00:15:39.286 "claimed": true, 00:15:39.286 "claim_type": "exclusive_write", 00:15:39.286 "zoned": false, 00:15:39.286 "supported_io_types": { 00:15:39.286 "read": true, 00:15:39.286 "write": true, 00:15:39.286 "unmap": true, 00:15:39.286 "flush": true, 00:15:39.286 "reset": true, 00:15:39.286 "nvme_admin": false, 00:15:39.286 "nvme_io": false, 00:15:39.286 "nvme_io_md": false, 00:15:39.286 "write_zeroes": true, 00:15:39.286 "zcopy": true, 00:15:39.286 "get_zone_info": false, 00:15:39.286 "zone_management": false, 00:15:39.286 "zone_append": false, 00:15:39.286 "compare": false, 00:15:39.286 "compare_and_write": false, 00:15:39.286 "abort": true, 00:15:39.286 "seek_hole": false, 00:15:39.286 "seek_data": false, 00:15:39.286 "copy": true, 00:15:39.286 "nvme_iov_md": false 00:15:39.286 }, 00:15:39.286 "memory_domains": [ 00:15:39.286 { 00:15:39.286 "dma_device_id": "system", 00:15:39.286 "dma_device_type": 1 00:15:39.286 }, 00:15:39.286 { 00:15:39.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.286 "dma_device_type": 2 00:15:39.286 } 00:15:39.286 ], 00:15:39.286 "driver_specific": { 00:15:39.286 "passthru": { 00:15:39.286 "name": "pt2", 00:15:39.286 "base_bdev_name": "malloc2" 00:15:39.286 } 00:15:39.286 } 00:15:39.286 }' 00:15:39.286 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.545 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.804 17:09:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.804 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.804 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:39.804 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:40.063 [2024-07-23 17:09:35.258222] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b6859c9c-d983-44e5-bf02-94e0f3b35982 '!=' b6859c9c-d983-44e5-bf02-94e0f3b35982 ']' 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4110482 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4110482 ']' 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4110482 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4110482 00:15:40.063 
17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4110482' 00:15:40.063 killing process with pid 4110482 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4110482 00:15:40.063 [2024-07-23 17:09:35.328751] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:40.063 [2024-07-23 17:09:35.328805] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:40.063 [2024-07-23 17:09:35.328847] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:40.063 [2024-07-23 17:09:35.328857] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28de5d0 name raid_bdev1, state offline 00:15:40.063 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4110482 00:15:40.063 [2024-07-23 17:09:35.348057] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:40.323 17:09:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:40.323 00:15:40.323 real 0m11.798s 00:15:40.323 user 0m21.155s 00:15:40.323 sys 0m2.166s 00:15:40.323 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:40.323 17:09:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.323 ************************************ 00:15:40.323 END TEST raid_superblock_test 00:15:40.323 ************************************ 00:15:40.323 17:09:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:40.323 17:09:35 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:15:40.323 17:09:35 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:40.323 17:09:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:40.323 17:09:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:40.323 ************************************ 00:15:40.323 START TEST raid_read_error_test 00:15:40.323 ************************************ 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.r2K018YWhq 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4112212 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4112212 /var/tmp/spdk-raid.sock 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4112212 ']' 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:40.323 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:40.323 17:09:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.323 [2024-07-23 17:09:35.722060] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:15:40.323 [2024-07-23 17:09:35.722127] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4112212 ] 00:15:40.583 [2024-07-23 17:09:35.854972] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:40.583 [2024-07-23 17:09:35.910726] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:40.583 [2024-07-23 17:09:35.977053] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:40.583 [2024-07-23 17:09:35.977095] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:41.519 17:09:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:41.519 17:09:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:41.519 17:09:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:41.519 17:09:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:41.519 BaseBdev1_malloc 00:15:41.519 17:09:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:41.841 true 00:15:41.841 17:09:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:42.101 [2024-07-23 17:09:37.370621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:42.101 [2024-07-23 17:09:37.370668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.101 [2024-07-23 17:09:37.370689] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27d95c0 00:15:42.101 [2024-07-23 17:09:37.370702] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.101 [2024-07-23 17:09:37.372393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.101 [2024-07-23 17:09:37.372421] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:42.101 BaseBdev1 00:15:42.101 17:09:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:42.101 17:09:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:42.360 BaseBdev2_malloc 00:15:42.360 17:09:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:42.619 true 00:15:42.619 17:09:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:42.878 [2024-07-23 17:09:38.106023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:42.878 [2024-07-23 17:09:38.106065] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.878 [2024-07-23 17:09:38.106084] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27d3620 00:15:42.878 [2024-07-23 17:09:38.106096] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.878 [2024-07-23 17:09:38.107501] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.878 [2024-07-23 17:09:38.107528] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:42.878 BaseBdev2 00:15:42.878 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:43.138 [2024-07-23 17:09:38.366746] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:43.138 [2024-07-23 17:09:38.368106] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:43.138 [2024-07-23 17:09:38.368284] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2627610 00:15:43.138 [2024-07-23 17:09:38.368297] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:43.138 [2024-07-23 17:09:38.368495] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c1a80 00:15:43.138 [2024-07-23 17:09:38.368642] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2627610 00:15:43.138 [2024-07-23 17:09:38.368652] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2627610 00:15:43.138 [2024-07-23 17:09:38.368761] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:15:43.138 17:09:38 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.138 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:43.397 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.397 "name": "raid_bdev1", 00:15:43.397 "uuid": "19c760be-42e7-4f25-8f4e-c72b5d7acff1", 00:15:43.397 "strip_size_kb": 64, 00:15:43.397 "state": "online", 00:15:43.397 "raid_level": "raid0", 00:15:43.397 "superblock": true, 00:15:43.397 "num_base_bdevs": 2, 00:15:43.397 "num_base_bdevs_discovered": 2, 00:15:43.397 "num_base_bdevs_operational": 2, 00:15:43.397 "base_bdevs_list": [ 00:15:43.397 { 00:15:43.397 "name": "BaseBdev1", 00:15:43.397 "uuid": "17cecb02-913c-5449-b8e3-4efc7d28d8d8", 00:15:43.397 "is_configured": true, 00:15:43.397 "data_offset": 2048, 00:15:43.397 "data_size": 63488 00:15:43.397 }, 
00:15:43.397 { 00:15:43.397 "name": "BaseBdev2", 00:15:43.397 "uuid": "f47bd1fc-39d8-5f96-855b-84daecfa251d", 00:15:43.397 "is_configured": true, 00:15:43.397 "data_offset": 2048, 00:15:43.397 "data_size": 63488 00:15:43.397 } 00:15:43.397 ] 00:15:43.397 }' 00:15:43.397 17:09:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.397 17:09:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.965 17:09:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:43.965 17:09:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:43.965 [2024-07-23 17:09:39.317528] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c1930 00:15:44.902 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.161 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:45.421 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.421 "name": "raid_bdev1", 00:15:45.421 "uuid": "19c760be-42e7-4f25-8f4e-c72b5d7acff1", 00:15:45.421 "strip_size_kb": 64, 00:15:45.421 "state": "online", 00:15:45.421 "raid_level": "raid0", 00:15:45.421 "superblock": true, 00:15:45.421 "num_base_bdevs": 2, 00:15:45.421 "num_base_bdevs_discovered": 2, 00:15:45.421 "num_base_bdevs_operational": 2, 00:15:45.421 "base_bdevs_list": [ 00:15:45.421 { 00:15:45.421 "name": "BaseBdev1", 00:15:45.421 "uuid": "17cecb02-913c-5449-b8e3-4efc7d28d8d8", 00:15:45.421 "is_configured": true, 00:15:45.421 "data_offset": 2048, 00:15:45.421 "data_size": 63488 00:15:45.421 }, 00:15:45.421 { 00:15:45.421 "name": "BaseBdev2", 00:15:45.421 "uuid": "f47bd1fc-39d8-5f96-855b-84daecfa251d", 00:15:45.421 "is_configured": true, 00:15:45.421 "data_offset": 2048, 00:15:45.421 "data_size": 63488 00:15:45.421 } 00:15:45.421 ] 00:15:45.421 }' 00:15:45.421 17:09:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.421 17:09:40 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:15:45.990 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:46.250 [2024-07-23 17:09:41.437967] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:46.250 [2024-07-23 17:09:41.438004] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:46.250 [2024-07-23 17:09:41.441175] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:46.250 [2024-07-23 17:09:41.441206] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:46.250 [2024-07-23 17:09:41.441234] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:46.250 [2024-07-23 17:09:41.441244] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2627610 name raid_bdev1, state offline 00:15:46.250 0 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4112212 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4112212 ']' 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4112212 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4112212 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 4112212' 00:15:46.250 killing process with pid 4112212 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4112212 00:15:46.250 [2024-07-23 17:09:41.522447] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:46.250 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4112212 00:15:46.250 [2024-07-23 17:09:41.533578] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.r2K018YWhq 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:15:46.510 00:15:46.510 real 0m6.116s 00:15:46.510 user 0m9.472s 00:15:46.510 sys 0m1.117s 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:46.510 17:09:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.510 ************************************ 00:15:46.510 END TEST raid_read_error_test 00:15:46.510 ************************************ 00:15:46.510 17:09:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:46.510 17:09:41 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:15:46.510 17:09:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:46.510 17:09:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:46.510 17:09:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:46.510 ************************************ 00:15:46.510 START TEST raid_write_error_test 00:15:46.510 ************************************ 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.XsZJqoBSW3 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4113096 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4113096 /var/tmp/spdk-raid.sock 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4113096 ']' 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:46.510 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:46.510 17:09:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:46.510 [2024-07-23 17:09:41.929502] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:15:46.510 [2024-07-23 17:09:41.929577] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4113096 ] 00:15:46.770 [2024-07-23 17:09:42.065566] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:46.770 [2024-07-23 17:09:42.120528] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:46.770 [2024-07-23 17:09:42.183990] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.770 [2024-07-23 17:09:42.184030] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:47.707 17:09:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:47.707 17:09:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:47.707 17:09:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:47.707 17:09:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:47.707 BaseBdev1_malloc 00:15:47.707 17:09:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:47.966 true 00:15:47.966 17:09:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:48.224 [2024-07-23 17:09:43.594666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:48.224 [2024-07-23 17:09:43.594714] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.224 [2024-07-23 17:09:43.594735] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x214d5c0 00:15:48.224 [2024-07-23 17:09:43.594747] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.224 [2024-07-23 17:09:43.596278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.224 [2024-07-23 17:09:43.596306] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:48.224 BaseBdev1 00:15:48.224 17:09:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:48.224 17:09:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:48.482 BaseBdev2_malloc 00:15:48.482 17:09:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:48.741 true 00:15:48.741 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:48.999 [2024-07-23 17:09:44.349418] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:48.999 [2024-07-23 17:09:44.349463] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.999 [2024-07-23 17:09:44.349484] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2147620 00:15:48.999 [2024-07-23 17:09:44.349496] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.999 [2024-07-23 17:09:44.350868] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.999 [2024-07-23 17:09:44.350902] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:48.999 BaseBdev2 00:15:48.999 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:49.258 [2024-07-23 17:09:44.590093] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:49.258 [2024-07-23 17:09:44.591261] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:49.258 [2024-07-23 17:09:44.591428] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f9b610 00:15:49.258 [2024-07-23 17:09:44.591441] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:15:49.258 [2024-07-23 17:09:44.591620] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2035a80 00:15:49.258 [2024-07-23 17:09:44.591758] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f9b610 00:15:49.258 [2024-07-23 17:09:44.591768] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f9b610 00:15:49.258 [2024-07-23 17:09:44.591867] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:15:49.258 17:09:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.258 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:49.517 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.517 "name": "raid_bdev1", 00:15:49.517 "uuid": "5480fa33-c8f0-49d3-ab9a-cc6dd72707a0", 00:15:49.517 "strip_size_kb": 64, 00:15:49.517 "state": "online", 00:15:49.517 "raid_level": "raid0", 00:15:49.517 "superblock": true, 00:15:49.517 "num_base_bdevs": 2, 00:15:49.517 "num_base_bdevs_discovered": 2, 00:15:49.517 "num_base_bdevs_operational": 2, 00:15:49.517 "base_bdevs_list": [ 00:15:49.517 { 00:15:49.517 "name": "BaseBdev1", 00:15:49.517 "uuid": "d039b76a-7cdd-5b71-a164-4b65e2613130", 00:15:49.517 "is_configured": true, 00:15:49.517 "data_offset": 2048, 00:15:49.517 "data_size": 63488 00:15:49.517 
}, 00:15:49.517 { 00:15:49.517 "name": "BaseBdev2", 00:15:49.517 "uuid": "8749609a-c1b3-5ddf-b4bb-a70dedb0d57b", 00:15:49.517 "is_configured": true, 00:15:49.517 "data_offset": 2048, 00:15:49.517 "data_size": 63488 00:15:49.517 } 00:15:49.517 ] 00:15:49.517 }' 00:15:49.517 17:09:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.517 17:09:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.085 17:09:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:50.085 17:09:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:50.344 [2024-07-23 17:09:45.601136] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2035930 00:15:51.281 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.541 17:09:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:51.800 17:09:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.800 "name": "raid_bdev1", 00:15:51.800 "uuid": "5480fa33-c8f0-49d3-ab9a-cc6dd72707a0", 00:15:51.800 "strip_size_kb": 64, 00:15:51.800 "state": "online", 00:15:51.800 "raid_level": "raid0", 00:15:51.800 "superblock": true, 00:15:51.800 "num_base_bdevs": 2, 00:15:51.800 "num_base_bdevs_discovered": 2, 00:15:51.800 "num_base_bdevs_operational": 2, 00:15:51.800 "base_bdevs_list": [ 00:15:51.800 { 00:15:51.800 "name": "BaseBdev1", 00:15:51.800 "uuid": "d039b76a-7cdd-5b71-a164-4b65e2613130", 00:15:51.800 "is_configured": true, 00:15:51.800 "data_offset": 2048, 00:15:51.800 "data_size": 63488 00:15:51.800 }, 00:15:51.800 { 00:15:51.800 "name": "BaseBdev2", 00:15:51.800 "uuid": "8749609a-c1b3-5ddf-b4bb-a70dedb0d57b", 00:15:51.800 "is_configured": true, 00:15:51.800 "data_offset": 2048, 00:15:51.800 "data_size": 63488 00:15:51.800 } 00:15:51.800 ] 00:15:51.800 }' 00:15:51.800 17:09:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.800 17:09:47 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.369 17:09:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:52.628 [2024-07-23 17:09:47.805873] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:52.628 [2024-07-23 17:09:47.805921] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:52.628 [2024-07-23 17:09:47.809086] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:52.628 [2024-07-23 17:09:47.809119] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.628 [2024-07-23 17:09:47.809147] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:52.629 [2024-07-23 17:09:47.809158] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f9b610 name raid_bdev1, state offline 00:15:52.629 0 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4113096 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4113096 ']' 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4113096 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4113096 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:52.629 17:09:47 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4113096' 00:15:52.629 killing process with pid 4113096 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4113096 00:15:52.629 [2024-07-23 17:09:47.890333] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:52.629 17:09:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4113096 00:15:52.629 [2024-07-23 17:09:47.901410] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.XsZJqoBSW3 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:52.888 00:15:52.888 real 0m6.278s 00:15:52.888 user 0m9.839s 00:15:52.888 sys 0m1.089s 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:52.888 17:09:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.888 ************************************ 00:15:52.888 END TEST raid_write_error_test 00:15:52.888 ************************************ 00:15:52.888 17:09:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:52.888 17:09:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:15:52.888 17:09:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:15:52.888 17:09:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:52.888 17:09:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:52.888 17:09:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:52.888 ************************************ 00:15:52.888 START TEST raid_state_function_test 00:15:52.888 ************************************ 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:52.888 17:09:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:52.888 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4114059 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4114059' 00:15:52.889 Process raid pid: 4114059 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4114059 /var/tmp/spdk-raid.sock 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4114059 ']' 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:52.889 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:52.889 17:09:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:52.889 [2024-07-23 17:09:48.289610] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:15:52.889 [2024-07-23 17:09:48.289680] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:53.148 [2024-07-23 17:09:48.423069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:53.148 [2024-07-23 17:09:48.479068] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:53.148 [2024-07-23 17:09:48.543385] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:53.148 [2024-07-23 17:09:48.543421] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:54.085 17:09:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:54.085 17:09:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:54.085 17:09:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 
-r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:54.653 [2024-07-23 17:09:49.979523] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:54.653 [2024-07-23 17:09:49.979563] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:54.653 [2024-07-23 17:09:49.979574] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:54.653 [2024-07-23 17:09:49.979585] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.653 17:09:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:54.911 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.911 "name": "Existed_Raid", 00:15:54.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:54.911 "strip_size_kb": 64, 00:15:54.911 "state": "configuring", 00:15:54.911 "raid_level": "concat", 00:15:54.911 "superblock": false, 00:15:54.911 "num_base_bdevs": 2, 00:15:54.911 "num_base_bdevs_discovered": 0, 00:15:54.911 "num_base_bdevs_operational": 2, 00:15:54.911 "base_bdevs_list": [ 00:15:54.911 { 00:15:54.911 "name": "BaseBdev1", 00:15:54.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:54.911 "is_configured": false, 00:15:54.911 "data_offset": 0, 00:15:54.911 "data_size": 0 00:15:54.911 }, 00:15:54.911 { 00:15:54.911 "name": "BaseBdev2", 00:15:54.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:54.911 "is_configured": false, 00:15:54.911 "data_offset": 0, 00:15:54.911 "data_size": 0 00:15:54.911 } 00:15:54.911 ] 00:15:54.911 }' 00:15:54.911 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.911 17:09:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.479 17:09:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:55.737 [2024-07-23 17:09:51.086320] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:55.737 [2024-07-23 17:09:51.086352] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e89410 name Existed_Raid, state configuring 00:15:55.737 17:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:55.996 
[2024-07-23 17:09:51.334992] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:55.996 [2024-07-23 17:09:51.335022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:55.996 [2024-07-23 17:09:51.335032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:55.996 [2024-07-23 17:09:51.335043] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:55.996 17:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:56.256 [2024-07-23 17:09:51.589520] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:56.256 BaseBdev1 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:56.256 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:56.519 17:09:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:56.810 [ 00:15:56.810 
{ 00:15:56.810 "name": "BaseBdev1", 00:15:56.810 "aliases": [ 00:15:56.810 "093faacb-fbb3-43fe-8444-511b017fdb92" 00:15:56.810 ], 00:15:56.810 "product_name": "Malloc disk", 00:15:56.810 "block_size": 512, 00:15:56.810 "num_blocks": 65536, 00:15:56.810 "uuid": "093faacb-fbb3-43fe-8444-511b017fdb92", 00:15:56.810 "assigned_rate_limits": { 00:15:56.810 "rw_ios_per_sec": 0, 00:15:56.810 "rw_mbytes_per_sec": 0, 00:15:56.810 "r_mbytes_per_sec": 0, 00:15:56.810 "w_mbytes_per_sec": 0 00:15:56.810 }, 00:15:56.810 "claimed": true, 00:15:56.810 "claim_type": "exclusive_write", 00:15:56.810 "zoned": false, 00:15:56.810 "supported_io_types": { 00:15:56.810 "read": true, 00:15:56.810 "write": true, 00:15:56.810 "unmap": true, 00:15:56.810 "flush": true, 00:15:56.810 "reset": true, 00:15:56.810 "nvme_admin": false, 00:15:56.810 "nvme_io": false, 00:15:56.810 "nvme_io_md": false, 00:15:56.810 "write_zeroes": true, 00:15:56.810 "zcopy": true, 00:15:56.810 "get_zone_info": false, 00:15:56.810 "zone_management": false, 00:15:56.810 "zone_append": false, 00:15:56.810 "compare": false, 00:15:56.810 "compare_and_write": false, 00:15:56.810 "abort": true, 00:15:56.810 "seek_hole": false, 00:15:56.810 "seek_data": false, 00:15:56.810 "copy": true, 00:15:56.810 "nvme_iov_md": false 00:15:56.810 }, 00:15:56.810 "memory_domains": [ 00:15:56.810 { 00:15:56.810 "dma_device_id": "system", 00:15:56.810 "dma_device_type": 1 00:15:56.810 }, 00:15:56.810 { 00:15:56.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.810 "dma_device_type": 2 00:15:56.810 } 00:15:56.810 ], 00:15:56.810 "driver_specific": {} 00:15:56.810 } 00:15:56.810 ] 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.810 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.069 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.069 "name": "Existed_Raid", 00:15:57.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.069 "strip_size_kb": 64, 00:15:57.069 "state": "configuring", 00:15:57.069 "raid_level": "concat", 00:15:57.069 "superblock": false, 00:15:57.069 "num_base_bdevs": 2, 00:15:57.069 "num_base_bdevs_discovered": 1, 00:15:57.069 "num_base_bdevs_operational": 2, 00:15:57.069 "base_bdevs_list": [ 00:15:57.069 { 00:15:57.069 "name": "BaseBdev1", 00:15:57.069 "uuid": "093faacb-fbb3-43fe-8444-511b017fdb92", 00:15:57.069 "is_configured": true, 00:15:57.069 "data_offset": 0, 00:15:57.069 "data_size": 65536 00:15:57.069 }, 00:15:57.069 { 
00:15:57.069 "name": "BaseBdev2", 00:15:57.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.069 "is_configured": false, 00:15:57.069 "data_offset": 0, 00:15:57.069 "data_size": 0 00:15:57.069 } 00:15:57.069 ] 00:15:57.069 }' 00:15:57.069 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.069 17:09:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.636 17:09:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:57.894 [2024-07-23 17:09:53.217839] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:57.894 [2024-07-23 17:09:53.217878] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e88d40 name Existed_Raid, state configuring 00:15:57.894 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:15:58.152 [2024-07-23 17:09:53.470543] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:58.152 [2024-07-23 17:09:53.471995] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.152 [2024-07-23 17:09:53.472026] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=Existed_Raid 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.152 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.410 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.410 "name": "Existed_Raid", 00:15:58.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.410 "strip_size_kb": 64, 00:15:58.410 "state": "configuring", 00:15:58.410 "raid_level": "concat", 00:15:58.410 "superblock": false, 00:15:58.410 "num_base_bdevs": 2, 00:15:58.410 "num_base_bdevs_discovered": 1, 00:15:58.410 "num_base_bdevs_operational": 2, 00:15:58.410 "base_bdevs_list": [ 00:15:58.410 { 00:15:58.410 "name": "BaseBdev1", 00:15:58.410 "uuid": "093faacb-fbb3-43fe-8444-511b017fdb92", 00:15:58.410 "is_configured": true, 00:15:58.410 "data_offset": 0, 00:15:58.410 "data_size": 65536 00:15:58.410 }, 
00:15:58.410 { 00:15:58.410 "name": "BaseBdev2", 00:15:58.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.410 "is_configured": false, 00:15:58.410 "data_offset": 0, 00:15:58.410 "data_size": 0 00:15:58.410 } 00:15:58.410 ] 00:15:58.410 }' 00:15:58.410 17:09:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.410 17:09:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:58.977 17:09:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:59.236 [2024-07-23 17:09:54.576853] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:59.236 [2024-07-23 17:09:54.576885] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e88990 00:15:59.236 [2024-07-23 17:09:54.576902] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:15:59.236 [2024-07-23 17:09:54.577147] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e8bed0 00:15:59.236 [2024-07-23 17:09:54.577257] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e88990 00:15:59.236 [2024-07-23 17:09:54.577267] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e88990 00:15:59.236 [2024-07-23 17:09:54.577421] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:59.236 BaseBdev2 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:59.236 17:09:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.494 17:09:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:59.752 [ 00:15:59.752 { 00:15:59.752 "name": "BaseBdev2", 00:15:59.752 "aliases": [ 00:15:59.752 "8668130a-1bad-41e7-86c3-02108a1189bc" 00:15:59.752 ], 00:15:59.752 "product_name": "Malloc disk", 00:15:59.752 "block_size": 512, 00:15:59.752 "num_blocks": 65536, 00:15:59.752 "uuid": "8668130a-1bad-41e7-86c3-02108a1189bc", 00:15:59.752 "assigned_rate_limits": { 00:15:59.752 "rw_ios_per_sec": 0, 00:15:59.752 "rw_mbytes_per_sec": 0, 00:15:59.752 "r_mbytes_per_sec": 0, 00:15:59.752 "w_mbytes_per_sec": 0 00:15:59.752 }, 00:15:59.752 "claimed": true, 00:15:59.752 "claim_type": "exclusive_write", 00:15:59.752 "zoned": false, 00:15:59.753 "supported_io_types": { 00:15:59.753 "read": true, 00:15:59.753 "write": true, 00:15:59.753 "unmap": true, 00:15:59.753 "flush": true, 00:15:59.753 "reset": true, 00:15:59.753 "nvme_admin": false, 00:15:59.753 "nvme_io": false, 00:15:59.753 "nvme_io_md": false, 00:15:59.753 "write_zeroes": true, 00:15:59.753 "zcopy": true, 00:15:59.753 "get_zone_info": false, 00:15:59.753 "zone_management": false, 00:15:59.753 "zone_append": false, 00:15:59.753 "compare": false, 00:15:59.753 "compare_and_write": false, 00:15:59.753 "abort": true, 00:15:59.753 "seek_hole": false, 00:15:59.753 "seek_data": false, 00:15:59.753 "copy": true, 00:15:59.753 "nvme_iov_md": false 
00:15:59.753 }, 00:15:59.753 "memory_domains": [ 00:15:59.753 { 00:15:59.753 "dma_device_id": "system", 00:15:59.753 "dma_device_type": 1 00:15:59.753 }, 00:15:59.753 { 00:15:59.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.753 "dma_device_type": 2 00:15:59.753 } 00:15:59.753 ], 00:15:59.753 "driver_specific": {} 00:15:59.753 } 00:15:59.753 ] 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:59.753 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.011 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.011 "name": "Existed_Raid", 00:16:00.011 "uuid": "7000ff1c-5771-4c54-b383-cabadd6b56a9", 00:16:00.011 "strip_size_kb": 64, 00:16:00.011 "state": "online", 00:16:00.011 "raid_level": "concat", 00:16:00.011 "superblock": false, 00:16:00.011 "num_base_bdevs": 2, 00:16:00.011 "num_base_bdevs_discovered": 2, 00:16:00.011 "num_base_bdevs_operational": 2, 00:16:00.011 "base_bdevs_list": [ 00:16:00.011 { 00:16:00.011 "name": "BaseBdev1", 00:16:00.011 "uuid": "093faacb-fbb3-43fe-8444-511b017fdb92", 00:16:00.011 "is_configured": true, 00:16:00.011 "data_offset": 0, 00:16:00.011 "data_size": 65536 00:16:00.011 }, 00:16:00.011 { 00:16:00.011 "name": "BaseBdev2", 00:16:00.011 "uuid": "8668130a-1bad-41e7-86c3-02108a1189bc", 00:16:00.011 "is_configured": true, 00:16:00.011 "data_offset": 0, 00:16:00.011 "data_size": 65536 00:16:00.011 } 00:16:00.011 ] 00:16:00.011 }' 00:16:00.011 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.011 17:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # 
local name 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:00.580 [2024-07-23 17:09:55.896625] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:00.580 "name": "Existed_Raid", 00:16:00.580 "aliases": [ 00:16:00.580 "7000ff1c-5771-4c54-b383-cabadd6b56a9" 00:16:00.580 ], 00:16:00.580 "product_name": "Raid Volume", 00:16:00.580 "block_size": 512, 00:16:00.580 "num_blocks": 131072, 00:16:00.580 "uuid": "7000ff1c-5771-4c54-b383-cabadd6b56a9", 00:16:00.580 "assigned_rate_limits": { 00:16:00.580 "rw_ios_per_sec": 0, 00:16:00.580 "rw_mbytes_per_sec": 0, 00:16:00.580 "r_mbytes_per_sec": 0, 00:16:00.580 "w_mbytes_per_sec": 0 00:16:00.580 }, 00:16:00.580 "claimed": false, 00:16:00.580 "zoned": false, 00:16:00.580 "supported_io_types": { 00:16:00.580 "read": true, 00:16:00.580 "write": true, 00:16:00.580 "unmap": true, 00:16:00.580 "flush": true, 00:16:00.580 "reset": true, 00:16:00.580 "nvme_admin": false, 00:16:00.580 "nvme_io": false, 00:16:00.580 "nvme_io_md": false, 00:16:00.580 "write_zeroes": true, 00:16:00.580 "zcopy": false, 00:16:00.580 "get_zone_info": false, 00:16:00.580 "zone_management": false, 00:16:00.580 "zone_append": false, 00:16:00.580 "compare": false, 00:16:00.580 "compare_and_write": false, 00:16:00.580 "abort": false, 00:16:00.580 "seek_hole": false, 00:16:00.580 "seek_data": false, 00:16:00.580 "copy": false, 00:16:00.580 "nvme_iov_md": false 00:16:00.580 }, 00:16:00.580 "memory_domains": [ 00:16:00.580 { 00:16:00.580 "dma_device_id": "system", 00:16:00.580 "dma_device_type": 1 00:16:00.580 }, 00:16:00.580 { 00:16:00.580 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:00.580 "dma_device_type": 2 00:16:00.580 }, 00:16:00.580 { 00:16:00.580 "dma_device_id": "system", 00:16:00.580 "dma_device_type": 1 00:16:00.580 }, 00:16:00.580 { 00:16:00.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.580 "dma_device_type": 2 00:16:00.580 } 00:16:00.580 ], 00:16:00.580 "driver_specific": { 00:16:00.580 "raid": { 00:16:00.580 "uuid": "7000ff1c-5771-4c54-b383-cabadd6b56a9", 00:16:00.580 "strip_size_kb": 64, 00:16:00.580 "state": "online", 00:16:00.580 "raid_level": "concat", 00:16:00.580 "superblock": false, 00:16:00.580 "num_base_bdevs": 2, 00:16:00.580 "num_base_bdevs_discovered": 2, 00:16:00.580 "num_base_bdevs_operational": 2, 00:16:00.580 "base_bdevs_list": [ 00:16:00.580 { 00:16:00.580 "name": "BaseBdev1", 00:16:00.580 "uuid": "093faacb-fbb3-43fe-8444-511b017fdb92", 00:16:00.580 "is_configured": true, 00:16:00.580 "data_offset": 0, 00:16:00.580 "data_size": 65536 00:16:00.580 }, 00:16:00.580 { 00:16:00.580 "name": "BaseBdev2", 00:16:00.580 "uuid": "8668130a-1bad-41e7-86c3-02108a1189bc", 00:16:00.580 "is_configured": true, 00:16:00.580 "data_offset": 0, 00:16:00.580 "data_size": 65536 00:16:00.580 } 00:16:00.580 ] 00:16:00.580 } 00:16:00.580 } 00:16:00.580 }' 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:00.580 BaseBdev2' 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:00.580 17:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:00.837 17:09:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.837 "name": "BaseBdev1", 00:16:00.837 "aliases": [ 00:16:00.837 "093faacb-fbb3-43fe-8444-511b017fdb92" 00:16:00.837 ], 00:16:00.837 "product_name": "Malloc disk", 00:16:00.837 "block_size": 512, 00:16:00.837 "num_blocks": 65536, 00:16:00.837 "uuid": "093faacb-fbb3-43fe-8444-511b017fdb92", 00:16:00.837 "assigned_rate_limits": { 00:16:00.837 "rw_ios_per_sec": 0, 00:16:00.837 "rw_mbytes_per_sec": 0, 00:16:00.837 "r_mbytes_per_sec": 0, 00:16:00.837 "w_mbytes_per_sec": 0 00:16:00.837 }, 00:16:00.837 "claimed": true, 00:16:00.837 "claim_type": "exclusive_write", 00:16:00.837 "zoned": false, 00:16:00.837 "supported_io_types": { 00:16:00.837 "read": true, 00:16:00.837 "write": true, 00:16:00.837 "unmap": true, 00:16:00.837 "flush": true, 00:16:00.837 "reset": true, 00:16:00.837 "nvme_admin": false, 00:16:00.837 "nvme_io": false, 00:16:00.837 "nvme_io_md": false, 00:16:00.837 "write_zeroes": true, 00:16:00.837 "zcopy": true, 00:16:00.837 "get_zone_info": false, 00:16:00.837 "zone_management": false, 00:16:00.837 "zone_append": false, 00:16:00.837 "compare": false, 00:16:00.837 "compare_and_write": false, 00:16:00.837 "abort": true, 00:16:00.837 "seek_hole": false, 00:16:00.837 "seek_data": false, 00:16:00.837 "copy": true, 00:16:00.837 "nvme_iov_md": false 00:16:00.837 }, 00:16:00.837 "memory_domains": [ 00:16:00.837 { 00:16:00.837 "dma_device_id": "system", 00:16:00.837 "dma_device_type": 1 00:16:00.837 }, 00:16:00.837 { 00:16:00.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.837 "dma_device_type": 2 00:16:00.837 } 00:16:00.837 ], 00:16:00.837 "driver_specific": {} 00:16:00.837 }' 00:16:00.838 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:01.095 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.353 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.353 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:01.353 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:01.353 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:01.353 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:01.612 "name": "BaseBdev2", 00:16:01.612 "aliases": [ 00:16:01.612 "8668130a-1bad-41e7-86c3-02108a1189bc" 00:16:01.612 ], 00:16:01.612 "product_name": "Malloc disk", 00:16:01.612 "block_size": 512, 00:16:01.612 "num_blocks": 65536, 00:16:01.612 "uuid": "8668130a-1bad-41e7-86c3-02108a1189bc", 00:16:01.612 "assigned_rate_limits": { 00:16:01.612 "rw_ios_per_sec": 0, 00:16:01.612 "rw_mbytes_per_sec": 0, 00:16:01.612 "r_mbytes_per_sec": 0, 00:16:01.612 "w_mbytes_per_sec": 0 00:16:01.612 }, 00:16:01.612 "claimed": true, 00:16:01.612 
"claim_type": "exclusive_write", 00:16:01.612 "zoned": false, 00:16:01.612 "supported_io_types": { 00:16:01.612 "read": true, 00:16:01.612 "write": true, 00:16:01.612 "unmap": true, 00:16:01.612 "flush": true, 00:16:01.612 "reset": true, 00:16:01.612 "nvme_admin": false, 00:16:01.612 "nvme_io": false, 00:16:01.612 "nvme_io_md": false, 00:16:01.612 "write_zeroes": true, 00:16:01.612 "zcopy": true, 00:16:01.612 "get_zone_info": false, 00:16:01.612 "zone_management": false, 00:16:01.612 "zone_append": false, 00:16:01.612 "compare": false, 00:16:01.612 "compare_and_write": false, 00:16:01.612 "abort": true, 00:16:01.612 "seek_hole": false, 00:16:01.612 "seek_data": false, 00:16:01.612 "copy": true, 00:16:01.612 "nvme_iov_md": false 00:16:01.612 }, 00:16:01.612 "memory_domains": [ 00:16:01.612 { 00:16:01.612 "dma_device_id": "system", 00:16:01.612 "dma_device_type": 1 00:16:01.612 }, 00:16:01.612 { 00:16:01.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.612 "dma_device_type": 2 00:16:01.612 } 00:16:01.612 ], 00:16:01.612 "driver_specific": {} 00:16:01.612 }' 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:01.612 17:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.612 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.870 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:16:01.870 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.870 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.870 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:01.870 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:02.129 [2024-07-23 17:09:57.368337] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:02.129 [2024-07-23 17:09:57.368361] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:02.129 [2024-07-23 17:09:57.368400] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.129 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.387 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.387 "name": "Existed_Raid", 00:16:02.387 "uuid": "7000ff1c-5771-4c54-b383-cabadd6b56a9", 00:16:02.387 "strip_size_kb": 64, 00:16:02.387 "state": "offline", 00:16:02.387 "raid_level": "concat", 00:16:02.387 "superblock": false, 00:16:02.388 "num_base_bdevs": 2, 00:16:02.388 "num_base_bdevs_discovered": 1, 00:16:02.388 "num_base_bdevs_operational": 1, 00:16:02.388 "base_bdevs_list": [ 00:16:02.388 { 00:16:02.388 "name": null, 00:16:02.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.388 "is_configured": false, 00:16:02.388 "data_offset": 0, 00:16:02.388 "data_size": 65536 00:16:02.388 }, 00:16:02.388 { 00:16:02.388 "name": "BaseBdev2", 00:16:02.388 "uuid": "8668130a-1bad-41e7-86c3-02108a1189bc", 00:16:02.388 "is_configured": true, 00:16:02.388 "data_offset": 0, 00:16:02.388 "data_size": 65536 00:16:02.388 } 00:16:02.388 ] 00:16:02.388 }' 00:16:02.388 17:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.388 17:09:57 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.954 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:02.954 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:02.954 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.954 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:03.211 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:03.211 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:03.211 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:03.469 [2024-07-23 17:09:58.712954] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:03.469 [2024-07-23 17:09:58.713000] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e88990 name Existed_Raid, state offline 00:16:03.469 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:03.469 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:03.469 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:03.469 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4114059 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4114059 ']' 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4114059 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4114059 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4114059' 00:16:03.728 killing process with pid 4114059 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4114059 00:16:03.728 [2024-07-23 17:09:58.993903] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:03.728 17:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4114059 00:16:03.728 [2024-07-23 17:09:58.994778] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:03.988 00:16:03.988 real 0m10.980s 00:16:03.988 user 0m19.474s 00:16:03.988 sys 0m2.107s 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:03.988 17:09:59 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.988 ************************************ 00:16:03.988 END TEST raid_state_function_test 00:16:03.988 ************************************ 00:16:03.988 17:09:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:03.988 17:09:59 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:16:03.988 17:09:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:03.988 17:09:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:03.988 17:09:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:03.988 ************************************ 00:16:03.988 START TEST raid_state_function_test_sb 00:16:03.988 ************************************ 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.988 17:09:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4115698 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4115698' 00:16:03.988 Process raid pid: 4115698 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc 
-r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4115698 /var/tmp/spdk-raid.sock 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4115698 ']' 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:03.988 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:03.988 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.988 [2024-07-23 17:09:59.359099] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:16:03.988 [2024-07-23 17:09:59.359166] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:04.247 [2024-07-23 17:09:59.483365] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:04.247 [2024-07-23 17:09:59.535742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:04.247 [2024-07-23 17:09:59.601415] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:04.247 [2024-07-23 17:09:59.601454] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:04.247 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:04.247 17:09:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:04.247 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:16:04.507 [2024-07-23 17:09:59.880819] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:04.507 [2024-07-23 17:09:59.880856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:04.507 [2024-07-23 17:09:59.880867] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:04.507 [2024-07-23 17:09:59.880879] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.507 17:09:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.765 17:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.765 "name": "Existed_Raid", 00:16:04.765 "uuid": "fa70d89d-fbde-4f27-a2b4-52339b6eb0c6", 00:16:04.765 "strip_size_kb": 64, 00:16:04.765 "state": "configuring", 00:16:04.765 "raid_level": "concat", 00:16:04.765 "superblock": true, 00:16:04.765 "num_base_bdevs": 2, 00:16:04.765 "num_base_bdevs_discovered": 0, 00:16:04.765 "num_base_bdevs_operational": 2, 00:16:04.765 "base_bdevs_list": [ 00:16:04.765 { 00:16:04.765 "name": "BaseBdev1", 00:16:04.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.765 "is_configured": false, 00:16:04.765 "data_offset": 0, 00:16:04.765 "data_size": 0 00:16:04.765 }, 00:16:04.765 { 
00:16:04.765 "name": "BaseBdev2", 00:16:04.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.765 "is_configured": false, 00:16:04.765 "data_offset": 0, 00:16:04.765 "data_size": 0 00:16:04.765 } 00:16:04.765 ] 00:16:04.765 }' 00:16:04.765 17:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.765 17:10:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.700 17:10:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:05.700 [2024-07-23 17:10:00.983607] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:05.700 [2024-07-23 17:10:00.983637] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1491410 name Existed_Raid, state configuring 00:16:05.700 17:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:16:05.959 [2024-07-23 17:10:01.232286] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:05.959 [2024-07-23 17:10:01.232315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:05.959 [2024-07-23 17:10:01.232325] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:05.959 [2024-07-23 17:10:01.232336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:05.959 17:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:06.217 [2024-07-23 17:10:01.490788] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:06.217 BaseBdev1 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:06.217 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:06.476 17:10:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:06.734 [ 00:16:06.734 { 00:16:06.734 "name": "BaseBdev1", 00:16:06.734 "aliases": [ 00:16:06.734 "7c507e0d-4055-4c92-bbd5-fc6475b8828f" 00:16:06.734 ], 00:16:06.734 "product_name": "Malloc disk", 00:16:06.734 "block_size": 512, 00:16:06.734 "num_blocks": 65536, 00:16:06.734 "uuid": "7c507e0d-4055-4c92-bbd5-fc6475b8828f", 00:16:06.734 "assigned_rate_limits": { 00:16:06.734 "rw_ios_per_sec": 0, 00:16:06.734 "rw_mbytes_per_sec": 0, 00:16:06.734 "r_mbytes_per_sec": 0, 00:16:06.734 "w_mbytes_per_sec": 0 00:16:06.734 }, 00:16:06.734 "claimed": true, 00:16:06.734 "claim_type": "exclusive_write", 00:16:06.734 "zoned": false, 00:16:06.734 "supported_io_types": { 00:16:06.734 "read": true, 00:16:06.734 "write": true, 00:16:06.734 "unmap": true, 00:16:06.734 "flush": 
true, 00:16:06.734 "reset": true, 00:16:06.734 "nvme_admin": false, 00:16:06.734 "nvme_io": false, 00:16:06.734 "nvme_io_md": false, 00:16:06.734 "write_zeroes": true, 00:16:06.734 "zcopy": true, 00:16:06.734 "get_zone_info": false, 00:16:06.734 "zone_management": false, 00:16:06.734 "zone_append": false, 00:16:06.734 "compare": false, 00:16:06.734 "compare_and_write": false, 00:16:06.734 "abort": true, 00:16:06.734 "seek_hole": false, 00:16:06.734 "seek_data": false, 00:16:06.734 "copy": true, 00:16:06.734 "nvme_iov_md": false 00:16:06.734 }, 00:16:06.734 "memory_domains": [ 00:16:06.734 { 00:16:06.734 "dma_device_id": "system", 00:16:06.734 "dma_device_type": 1 00:16:06.734 }, 00:16:06.734 { 00:16:06.734 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.734 "dma_device_type": 2 00:16:06.734 } 00:16:06.735 ], 00:16:06.735 "driver_specific": {} 00:16:06.735 } 00:16:06.735 ] 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.735 17:10:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.735 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.993 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.993 "name": "Existed_Raid", 00:16:06.993 "uuid": "8dc5f707-4190-4eb6-9916-ca4a71750f5b", 00:16:06.993 "strip_size_kb": 64, 00:16:06.993 "state": "configuring", 00:16:06.993 "raid_level": "concat", 00:16:06.993 "superblock": true, 00:16:06.993 "num_base_bdevs": 2, 00:16:06.993 "num_base_bdevs_discovered": 1, 00:16:06.993 "num_base_bdevs_operational": 2, 00:16:06.993 "base_bdevs_list": [ 00:16:06.993 { 00:16:06.993 "name": "BaseBdev1", 00:16:06.993 "uuid": "7c507e0d-4055-4c92-bbd5-fc6475b8828f", 00:16:06.993 "is_configured": true, 00:16:06.993 "data_offset": 2048, 00:16:06.993 "data_size": 63488 00:16:06.993 }, 00:16:06.993 { 00:16:06.993 "name": "BaseBdev2", 00:16:06.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.993 "is_configured": false, 00:16:06.993 "data_offset": 0, 00:16:06.993 "data_size": 0 00:16:06.993 } 00:16:06.993 ] 00:16:06.993 }' 00:16:06.993 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.994 17:10:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.562 17:10:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:07.821 [2024-07-23 17:10:03.211528] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:07.821 [2024-07-23 17:10:03.211571] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1490d40 name Existed_Raid, state configuring 00:16:07.821 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:16:08.079 [2024-07-23 17:10:03.460233] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:08.079 [2024-07-23 17:10:03.461733] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:08.079 [2024-07-23 17:10:03.461767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:08.079 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:08.079 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:08.079 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:16:08.079 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.080 17:10:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.080 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.338 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.338 "name": "Existed_Raid", 00:16:08.338 "uuid": "40c375f5-84d9-41b5-b025-0c80da5ff2c6", 00:16:08.338 "strip_size_kb": 64, 00:16:08.338 "state": "configuring", 00:16:08.338 "raid_level": "concat", 00:16:08.338 "superblock": true, 00:16:08.338 "num_base_bdevs": 2, 00:16:08.338 "num_base_bdevs_discovered": 1, 00:16:08.338 "num_base_bdevs_operational": 2, 00:16:08.338 "base_bdevs_list": [ 00:16:08.338 { 00:16:08.338 "name": "BaseBdev1", 00:16:08.338 "uuid": "7c507e0d-4055-4c92-bbd5-fc6475b8828f", 00:16:08.338 "is_configured": true, 00:16:08.338 "data_offset": 2048, 00:16:08.338 "data_size": 63488 00:16:08.338 }, 00:16:08.338 { 00:16:08.338 "name": "BaseBdev2", 00:16:08.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.338 "is_configured": false, 00:16:08.338 "data_offset": 0, 00:16:08.338 "data_size": 0 00:16:08.338 } 00:16:08.338 ] 00:16:08.338 }' 00:16:08.338 17:10:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.338 17:10:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:09.275 [2024-07-23 17:10:04.522469] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:09.275 [2024-07-23 17:10:04.522616] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1490990 00:16:09.275 [2024-07-23 17:10:04.522634] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:16:09.275 [2024-07-23 17:10:04.522806] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1494a60 00:16:09.275 [2024-07-23 17:10:04.522937] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1490990 00:16:09.275 [2024-07-23 17:10:04.522948] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1490990 00:16:09.275 [2024-07-23 17:10:04.523043] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:09.275 BaseBdev2 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:09.275 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:09.534 17:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:09.793 [ 00:16:09.793 { 00:16:09.793 "name": "BaseBdev2", 00:16:09.793 "aliases": [ 00:16:09.793 "64356cc6-0fcd-4932-8d4a-aad63341072f" 00:16:09.793 ], 00:16:09.793 "product_name": "Malloc disk", 00:16:09.793 "block_size": 512, 00:16:09.793 "num_blocks": 65536, 00:16:09.793 "uuid": "64356cc6-0fcd-4932-8d4a-aad63341072f", 00:16:09.793 "assigned_rate_limits": { 00:16:09.793 "rw_ios_per_sec": 0, 00:16:09.793 "rw_mbytes_per_sec": 0, 00:16:09.793 "r_mbytes_per_sec": 0, 00:16:09.793 "w_mbytes_per_sec": 0 00:16:09.793 }, 00:16:09.794 "claimed": true, 00:16:09.794 "claim_type": "exclusive_write", 00:16:09.794 "zoned": false, 00:16:09.794 "supported_io_types": { 00:16:09.794 "read": true, 00:16:09.794 "write": true, 00:16:09.794 "unmap": true, 00:16:09.794 "flush": true, 00:16:09.794 "reset": true, 00:16:09.794 "nvme_admin": false, 00:16:09.794 "nvme_io": false, 00:16:09.794 "nvme_io_md": false, 00:16:09.794 "write_zeroes": true, 00:16:09.794 "zcopy": true, 00:16:09.794 "get_zone_info": false, 00:16:09.794 "zone_management": false, 00:16:09.794 "zone_append": false, 00:16:09.794 "compare": false, 00:16:09.794 "compare_and_write": false, 00:16:09.794 "abort": true, 00:16:09.794 "seek_hole": false, 00:16:09.794 "seek_data": false, 00:16:09.794 "copy": true, 00:16:09.794 "nvme_iov_md": false 00:16:09.794 }, 00:16:09.794 "memory_domains": [ 00:16:09.794 { 00:16:09.794 "dma_device_id": "system", 00:16:09.794 "dma_device_type": 1 00:16:09.794 }, 00:16:09.794 { 00:16:09.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.794 "dma_device_type": 2 00:16:09.794 } 00:16:09.794 ], 00:16:09.794 "driver_specific": {} 00:16:09.794 } 00:16:09.794 ] 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.794 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.053 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.053 "name": "Existed_Raid", 00:16:10.053 "uuid": "40c375f5-84d9-41b5-b025-0c80da5ff2c6", 00:16:10.053 "strip_size_kb": 64, 00:16:10.053 "state": "online", 00:16:10.053 "raid_level": "concat", 00:16:10.053 "superblock": true, 00:16:10.053 
"num_base_bdevs": 2, 00:16:10.053 "num_base_bdevs_discovered": 2, 00:16:10.053 "num_base_bdevs_operational": 2, 00:16:10.053 "base_bdevs_list": [ 00:16:10.053 { 00:16:10.053 "name": "BaseBdev1", 00:16:10.053 "uuid": "7c507e0d-4055-4c92-bbd5-fc6475b8828f", 00:16:10.053 "is_configured": true, 00:16:10.053 "data_offset": 2048, 00:16:10.053 "data_size": 63488 00:16:10.053 }, 00:16:10.053 { 00:16:10.053 "name": "BaseBdev2", 00:16:10.053 "uuid": "64356cc6-0fcd-4932-8d4a-aad63341072f", 00:16:10.053 "is_configured": true, 00:16:10.053 "data_offset": 2048, 00:16:10.053 "data_size": 63488 00:16:10.053 } 00:16:10.053 ] 00:16:10.053 }' 00:16:10.053 17:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.053 17:10:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:11.022 [2024-07-23 17:10:06.367653] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:11.022 17:10:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:11.022 "name": "Existed_Raid", 00:16:11.022 "aliases": [ 00:16:11.022 "40c375f5-84d9-41b5-b025-0c80da5ff2c6" 00:16:11.022 ], 00:16:11.022 "product_name": "Raid Volume", 00:16:11.022 "block_size": 512, 00:16:11.022 "num_blocks": 126976, 00:16:11.022 "uuid": "40c375f5-84d9-41b5-b025-0c80da5ff2c6", 00:16:11.022 "assigned_rate_limits": { 00:16:11.022 "rw_ios_per_sec": 0, 00:16:11.022 "rw_mbytes_per_sec": 0, 00:16:11.022 "r_mbytes_per_sec": 0, 00:16:11.022 "w_mbytes_per_sec": 0 00:16:11.022 }, 00:16:11.022 "claimed": false, 00:16:11.022 "zoned": false, 00:16:11.022 "supported_io_types": { 00:16:11.022 "read": true, 00:16:11.022 "write": true, 00:16:11.022 "unmap": true, 00:16:11.022 "flush": true, 00:16:11.022 "reset": true, 00:16:11.022 "nvme_admin": false, 00:16:11.022 "nvme_io": false, 00:16:11.022 "nvme_io_md": false, 00:16:11.022 "write_zeroes": true, 00:16:11.022 "zcopy": false, 00:16:11.022 "get_zone_info": false, 00:16:11.022 "zone_management": false, 00:16:11.022 "zone_append": false, 00:16:11.022 "compare": false, 00:16:11.022 "compare_and_write": false, 00:16:11.022 "abort": false, 00:16:11.022 "seek_hole": false, 00:16:11.022 "seek_data": false, 00:16:11.022 "copy": false, 00:16:11.022 "nvme_iov_md": false 00:16:11.022 }, 00:16:11.022 "memory_domains": [ 00:16:11.022 { 00:16:11.022 "dma_device_id": "system", 00:16:11.022 "dma_device_type": 1 00:16:11.022 }, 00:16:11.022 { 00:16:11.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.022 "dma_device_type": 2 00:16:11.022 }, 00:16:11.022 { 00:16:11.022 "dma_device_id": "system", 00:16:11.022 "dma_device_type": 1 00:16:11.022 }, 00:16:11.022 { 00:16:11.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.022 "dma_device_type": 2 00:16:11.022 } 00:16:11.022 ], 00:16:11.022 "driver_specific": { 00:16:11.022 "raid": { 00:16:11.022 "uuid": "40c375f5-84d9-41b5-b025-0c80da5ff2c6", 00:16:11.022 "strip_size_kb": 64, 
00:16:11.022 "state": "online", 00:16:11.022 "raid_level": "concat", 00:16:11.022 "superblock": true, 00:16:11.022 "num_base_bdevs": 2, 00:16:11.022 "num_base_bdevs_discovered": 2, 00:16:11.022 "num_base_bdevs_operational": 2, 00:16:11.022 "base_bdevs_list": [ 00:16:11.022 { 00:16:11.022 "name": "BaseBdev1", 00:16:11.022 "uuid": "7c507e0d-4055-4c92-bbd5-fc6475b8828f", 00:16:11.022 "is_configured": true, 00:16:11.022 "data_offset": 2048, 00:16:11.022 "data_size": 63488 00:16:11.022 }, 00:16:11.022 { 00:16:11.022 "name": "BaseBdev2", 00:16:11.022 "uuid": "64356cc6-0fcd-4932-8d4a-aad63341072f", 00:16:11.022 "is_configured": true, 00:16:11.022 "data_offset": 2048, 00:16:11.022 "data_size": 63488 00:16:11.022 } 00:16:11.022 ] 00:16:11.022 } 00:16:11.022 } 00:16:11.022 }' 00:16:11.022 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:11.281 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:11.281 BaseBdev2' 00:16:11.281 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.281 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:11.281 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.850 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.850 "name": "BaseBdev1", 00:16:11.850 "aliases": [ 00:16:11.850 "7c507e0d-4055-4c92-bbd5-fc6475b8828f" 00:16:11.850 ], 00:16:11.850 "product_name": "Malloc disk", 00:16:11.850 "block_size": 512, 00:16:11.850 "num_blocks": 65536, 00:16:11.850 "uuid": "7c507e0d-4055-4c92-bbd5-fc6475b8828f", 00:16:11.850 "assigned_rate_limits": { 00:16:11.850 "rw_ios_per_sec": 0, 
00:16:11.850 "rw_mbytes_per_sec": 0, 00:16:11.850 "r_mbytes_per_sec": 0, 00:16:11.850 "w_mbytes_per_sec": 0 00:16:11.850 }, 00:16:11.850 "claimed": true, 00:16:11.850 "claim_type": "exclusive_write", 00:16:11.850 "zoned": false, 00:16:11.850 "supported_io_types": { 00:16:11.850 "read": true, 00:16:11.850 "write": true, 00:16:11.850 "unmap": true, 00:16:11.850 "flush": true, 00:16:11.850 "reset": true, 00:16:11.850 "nvme_admin": false, 00:16:11.850 "nvme_io": false, 00:16:11.850 "nvme_io_md": false, 00:16:11.850 "write_zeroes": true, 00:16:11.850 "zcopy": true, 00:16:11.850 "get_zone_info": false, 00:16:11.850 "zone_management": false, 00:16:11.850 "zone_append": false, 00:16:11.850 "compare": false, 00:16:11.850 "compare_and_write": false, 00:16:11.850 "abort": true, 00:16:11.850 "seek_hole": false, 00:16:11.850 "seek_data": false, 00:16:11.850 "copy": true, 00:16:11.850 "nvme_iov_md": false 00:16:11.850 }, 00:16:11.850 "memory_domains": [ 00:16:11.850 { 00:16:11.850 "dma_device_id": "system", 00:16:11.850 "dma_device_type": 1 00:16:11.850 }, 00:16:11.850 { 00:16:11.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.850 "dma_device_type": 2 00:16:11.850 } 00:16:11.850 ], 00:16:11.850 "driver_specific": {} 00:16:11.850 }' 00:16:11.850 17:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.850 
17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.850 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.109 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.109 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.109 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:12.109 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.109 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:12.368 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.368 "name": "BaseBdev2", 00:16:12.368 "aliases": [ 00:16:12.368 "64356cc6-0fcd-4932-8d4a-aad63341072f" 00:16:12.368 ], 00:16:12.368 "product_name": "Malloc disk", 00:16:12.368 "block_size": 512, 00:16:12.368 "num_blocks": 65536, 00:16:12.368 "uuid": "64356cc6-0fcd-4932-8d4a-aad63341072f", 00:16:12.368 "assigned_rate_limits": { 00:16:12.368 "rw_ios_per_sec": 0, 00:16:12.368 "rw_mbytes_per_sec": 0, 00:16:12.368 "r_mbytes_per_sec": 0, 00:16:12.368 "w_mbytes_per_sec": 0 00:16:12.368 }, 00:16:12.369 "claimed": true, 00:16:12.369 "claim_type": "exclusive_write", 00:16:12.369 "zoned": false, 00:16:12.369 "supported_io_types": { 00:16:12.369 "read": true, 00:16:12.369 "write": true, 00:16:12.369 "unmap": true, 00:16:12.369 "flush": true, 00:16:12.369 "reset": true, 00:16:12.369 "nvme_admin": false, 00:16:12.369 "nvme_io": false, 00:16:12.369 "nvme_io_md": false, 00:16:12.369 "write_zeroes": true, 00:16:12.369 "zcopy": true, 
00:16:12.369 "get_zone_info": false, 00:16:12.369 "zone_management": false, 00:16:12.369 "zone_append": false, 00:16:12.369 "compare": false, 00:16:12.369 "compare_and_write": false, 00:16:12.369 "abort": true, 00:16:12.369 "seek_hole": false, 00:16:12.369 "seek_data": false, 00:16:12.369 "copy": true, 00:16:12.369 "nvme_iov_md": false 00:16:12.369 }, 00:16:12.369 "memory_domains": [ 00:16:12.369 { 00:16:12.369 "dma_device_id": "system", 00:16:12.369 "dma_device_type": 1 00:16:12.369 }, 00:16:12.369 { 00:16:12.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.369 "dma_device_type": 2 00:16:12.369 } 00:16:12.369 ], 00:16:12.369 "driver_specific": {} 00:16:12.369 }' 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.369 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.628 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.628 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.628 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.628 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.628 17:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.628 17:10:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:12.887 [2024-07-23 17:10:08.192290] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:12.887 [2024-07-23 17:10:08.192319] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.887 [2024-07-23 17:10:08.192361] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.887 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.147 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.147 "name": "Existed_Raid", 00:16:13.147 "uuid": "40c375f5-84d9-41b5-b025-0c80da5ff2c6", 00:16:13.147 "strip_size_kb": 64, 00:16:13.147 "state": "offline", 00:16:13.147 "raid_level": "concat", 00:16:13.147 "superblock": true, 00:16:13.147 "num_base_bdevs": 2, 00:16:13.147 "num_base_bdevs_discovered": 1, 00:16:13.147 "num_base_bdevs_operational": 1, 00:16:13.147 "base_bdevs_list": [ 00:16:13.147 { 00:16:13.147 "name": null, 00:16:13.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.147 "is_configured": false, 00:16:13.147 "data_offset": 2048, 00:16:13.147 "data_size": 63488 00:16:13.147 }, 00:16:13.147 { 00:16:13.147 "name": "BaseBdev2", 00:16:13.147 "uuid": "64356cc6-0fcd-4932-8d4a-aad63341072f", 00:16:13.147 "is_configured": true, 00:16:13.147 "data_offset": 2048, 00:16:13.147 "data_size": 63488 00:16:13.147 } 00:16:13.147 ] 00:16:13.147 }' 00:16:13.147 17:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.147 17:10:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.715 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:13.715 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:16:13.715 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.715 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:13.974 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:13.974 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:13.974 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:14.542 [2024-07-23 17:10:09.834552] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:14.543 [2024-07-23 17:10:09.834609] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1490990 name Existed_Raid, state offline 00:16:14.543 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:14.543 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:14.543 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.543 17:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
4115698 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4115698 ']' 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4115698 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4115698 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4115698' 00:16:14.801 killing process with pid 4115698 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4115698 00:16:14.801 [2024-07-23 17:10:10.169458] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:14.801 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4115698 00:16:14.801 [2024-07-23 17:10:10.170400] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:15.060 17:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:15.060 00:16:15.060 real 0m11.090s 00:16:15.060 user 0m20.168s 00:16:15.060 sys 0m2.130s 00:16:15.060 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:15.060 17:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:15.060 ************************************ 00:16:15.060 END TEST raid_state_function_test_sb 00:16:15.060 
************************************ 00:16:15.060 17:10:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:15.060 17:10:10 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:16:15.060 17:10:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:15.060 17:10:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:15.060 17:10:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:15.060 ************************************ 00:16:15.060 START TEST raid_superblock_test 00:16:15.060 ************************************ 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4117339 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4117339 /var/tmp/spdk-raid.sock 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4117339 ']' 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:15.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:15.060 17:10:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.319 [2024-07-23 17:10:10.528839] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:16:15.319 [2024-07-23 17:10:10.528920] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4117339 ] 00:16:15.319 [2024-07-23 17:10:10.659475] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:15.319 [2024-07-23 17:10:10.710912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:15.578 [2024-07-23 17:10:10.772770] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:15.578 [2024-07-23 17:10:10.772813] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:16.514 17:10:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:17.082 malloc1 00:16:17.082 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:17.340 [2024-07-23 17:10:12.745173] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:17.340 [2024-07-23 17:10:12.745223] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.340 [2024-07-23 17:10:12.745245] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1377070 00:16:17.340 [2024-07-23 17:10:12.745258] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.340 [2024-07-23 17:10:12.746924] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.340 [2024-07-23 17:10:12.746953] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:17.340 pt1 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:17.599 17:10:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:17.599 17:10:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:17.857 malloc2 00:16:18.115 17:10:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:18.372 [2024-07-23 17:10:13.777031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:18.372 [2024-07-23 17:10:13.777082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.372 [2024-07-23 17:10:13.777101] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125d920 00:16:18.372 [2024-07-23 17:10:13.777113] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.372 [2024-07-23 17:10:13.778862] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.372 [2024-07-23 17:10:13.778891] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:18.372 pt2 00:16:18.630 17:10:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:18.630 17:10:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:18.630 17:10:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:16:18.889 [2024-07-23 17:10:14.290398] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:18.889 [2024-07-23 17:10:14.291775] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:18.889 [2024-07-23 17:10:14.291931] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x136f3e0 
00:16:18.889 [2024-07-23 17:10:14.291946] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:16:18.889 [2024-07-23 17:10:14.292146] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1370280 00:16:18.889 [2024-07-23 17:10:14.292285] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x136f3e0 00:16:18.889 [2024-07-23 17:10:14.292295] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x136f3e0 00:16:18.889 [2024-07-23 17:10:14.292391] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.148 17:10:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:19.407 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.407 "name": "raid_bdev1", 00:16:19.407 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:19.407 "strip_size_kb": 64, 00:16:19.407 "state": "online", 00:16:19.407 "raid_level": "concat", 00:16:19.407 "superblock": true, 00:16:19.407 "num_base_bdevs": 2, 00:16:19.407 "num_base_bdevs_discovered": 2, 00:16:19.407 "num_base_bdevs_operational": 2, 00:16:19.407 "base_bdevs_list": [ 00:16:19.407 { 00:16:19.407 "name": "pt1", 00:16:19.407 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:19.407 "is_configured": true, 00:16:19.407 "data_offset": 2048, 00:16:19.407 "data_size": 63488 00:16:19.407 }, 00:16:19.407 { 00:16:19.407 "name": "pt2", 00:16:19.407 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:19.407 "is_configured": true, 00:16:19.407 "data_offset": 2048, 00:16:19.407 "data_size": 63488 00:16:19.407 } 00:16:19.407 ] 00:16:19.407 }' 00:16:19.407 17:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.407 17:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:20.343 17:10:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:20.343 [2024-07-23 17:10:15.670274] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:20.343 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:20.343 "name": "raid_bdev1", 00:16:20.343 "aliases": [ 00:16:20.343 "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad" 00:16:20.343 ], 00:16:20.343 "product_name": "Raid Volume", 00:16:20.343 "block_size": 512, 00:16:20.343 "num_blocks": 126976, 00:16:20.343 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:20.343 "assigned_rate_limits": { 00:16:20.343 "rw_ios_per_sec": 0, 00:16:20.343 "rw_mbytes_per_sec": 0, 00:16:20.343 "r_mbytes_per_sec": 0, 00:16:20.343 "w_mbytes_per_sec": 0 00:16:20.343 }, 00:16:20.343 "claimed": false, 00:16:20.343 "zoned": false, 00:16:20.343 "supported_io_types": { 00:16:20.343 "read": true, 00:16:20.343 "write": true, 00:16:20.343 "unmap": true, 00:16:20.343 "flush": true, 00:16:20.343 "reset": true, 00:16:20.343 "nvme_admin": false, 00:16:20.343 "nvme_io": false, 00:16:20.343 "nvme_io_md": false, 00:16:20.343 "write_zeroes": true, 00:16:20.343 "zcopy": false, 00:16:20.343 "get_zone_info": false, 00:16:20.343 "zone_management": false, 00:16:20.343 "zone_append": false, 00:16:20.343 "compare": false, 00:16:20.343 "compare_and_write": false, 00:16:20.343 "abort": false, 00:16:20.343 "seek_hole": false, 00:16:20.343 "seek_data": false, 00:16:20.343 "copy": false, 00:16:20.343 "nvme_iov_md": false 00:16:20.343 }, 00:16:20.343 "memory_domains": [ 00:16:20.343 { 00:16:20.343 "dma_device_id": "system", 00:16:20.343 "dma_device_type": 1 00:16:20.343 }, 00:16:20.343 { 00:16:20.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.343 "dma_device_type": 2 00:16:20.343 }, 00:16:20.343 { 00:16:20.343 "dma_device_id": "system", 00:16:20.343 "dma_device_type": 
1 00:16:20.343 }, 00:16:20.343 { 00:16:20.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.343 "dma_device_type": 2 00:16:20.343 } 00:16:20.343 ], 00:16:20.343 "driver_specific": { 00:16:20.343 "raid": { 00:16:20.343 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:20.343 "strip_size_kb": 64, 00:16:20.343 "state": "online", 00:16:20.343 "raid_level": "concat", 00:16:20.343 "superblock": true, 00:16:20.343 "num_base_bdevs": 2, 00:16:20.343 "num_base_bdevs_discovered": 2, 00:16:20.343 "num_base_bdevs_operational": 2, 00:16:20.343 "base_bdevs_list": [ 00:16:20.343 { 00:16:20.343 "name": "pt1", 00:16:20.343 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:20.343 "is_configured": true, 00:16:20.344 "data_offset": 2048, 00:16:20.344 "data_size": 63488 00:16:20.344 }, 00:16:20.344 { 00:16:20.344 "name": "pt2", 00:16:20.344 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:20.344 "is_configured": true, 00:16:20.344 "data_offset": 2048, 00:16:20.344 "data_size": 63488 00:16:20.344 } 00:16:20.344 ] 00:16:20.344 } 00:16:20.344 } 00:16:20.344 }' 00:16:20.344 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:20.603 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:20.603 pt2' 00:16:20.603 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:20.603 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:20.603 17:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.171 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.171 "name": "pt1", 00:16:21.171 "aliases": [ 00:16:21.171 "00000000-0000-0000-0000-000000000001" 00:16:21.171 ], 00:16:21.171 
"product_name": "passthru", 00:16:21.171 "block_size": 512, 00:16:21.171 "num_blocks": 65536, 00:16:21.171 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:21.171 "assigned_rate_limits": { 00:16:21.171 "rw_ios_per_sec": 0, 00:16:21.171 "rw_mbytes_per_sec": 0, 00:16:21.171 "r_mbytes_per_sec": 0, 00:16:21.171 "w_mbytes_per_sec": 0 00:16:21.171 }, 00:16:21.171 "claimed": true, 00:16:21.171 "claim_type": "exclusive_write", 00:16:21.171 "zoned": false, 00:16:21.171 "supported_io_types": { 00:16:21.171 "read": true, 00:16:21.171 "write": true, 00:16:21.171 "unmap": true, 00:16:21.171 "flush": true, 00:16:21.171 "reset": true, 00:16:21.171 "nvme_admin": false, 00:16:21.171 "nvme_io": false, 00:16:21.171 "nvme_io_md": false, 00:16:21.171 "write_zeroes": true, 00:16:21.171 "zcopy": true, 00:16:21.171 "get_zone_info": false, 00:16:21.171 "zone_management": false, 00:16:21.171 "zone_append": false, 00:16:21.171 "compare": false, 00:16:21.171 "compare_and_write": false, 00:16:21.171 "abort": true, 00:16:21.171 "seek_hole": false, 00:16:21.171 "seek_data": false, 00:16:21.171 "copy": true, 00:16:21.171 "nvme_iov_md": false 00:16:21.171 }, 00:16:21.171 "memory_domains": [ 00:16:21.171 { 00:16:21.171 "dma_device_id": "system", 00:16:21.171 "dma_device_type": 1 00:16:21.171 }, 00:16:21.171 { 00:16:21.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.171 "dma_device_type": 2 00:16:21.171 } 00:16:21.171 ], 00:16:21.171 "driver_specific": { 00:16:21.171 "passthru": { 00:16:21.171 "name": "pt1", 00:16:21.171 "base_bdev_name": "malloc1" 00:16:21.171 } 00:16:21.171 } 00:16:21.171 }' 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.172 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.431 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.431 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.431 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:21.431 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:21.431 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.690 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.690 "name": "pt2", 00:16:21.690 "aliases": [ 00:16:21.690 "00000000-0000-0000-0000-000000000002" 00:16:21.690 ], 00:16:21.690 "product_name": "passthru", 00:16:21.690 "block_size": 512, 00:16:21.690 "num_blocks": 65536, 00:16:21.690 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:21.690 "assigned_rate_limits": { 00:16:21.690 "rw_ios_per_sec": 0, 00:16:21.690 "rw_mbytes_per_sec": 0, 00:16:21.690 "r_mbytes_per_sec": 0, 00:16:21.690 "w_mbytes_per_sec": 0 00:16:21.690 }, 00:16:21.690 "claimed": true, 00:16:21.690 "claim_type": "exclusive_write", 00:16:21.690 "zoned": false, 00:16:21.690 "supported_io_types": { 00:16:21.690 "read": true, 00:16:21.690 "write": true, 00:16:21.690 
"unmap": true, 00:16:21.690 "flush": true, 00:16:21.690 "reset": true, 00:16:21.690 "nvme_admin": false, 00:16:21.690 "nvme_io": false, 00:16:21.690 "nvme_io_md": false, 00:16:21.690 "write_zeroes": true, 00:16:21.690 "zcopy": true, 00:16:21.690 "get_zone_info": false, 00:16:21.690 "zone_management": false, 00:16:21.690 "zone_append": false, 00:16:21.690 "compare": false, 00:16:21.690 "compare_and_write": false, 00:16:21.690 "abort": true, 00:16:21.690 "seek_hole": false, 00:16:21.690 "seek_data": false, 00:16:21.690 "copy": true, 00:16:21.690 "nvme_iov_md": false 00:16:21.690 }, 00:16:21.690 "memory_domains": [ 00:16:21.690 { 00:16:21.690 "dma_device_id": "system", 00:16:21.690 "dma_device_type": 1 00:16:21.690 }, 00:16:21.690 { 00:16:21.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.690 "dma_device_type": 2 00:16:21.690 } 00:16:21.690 ], 00:16:21.690 "driver_specific": { 00:16:21.690 "passthru": { 00:16:21.690 "name": "pt2", 00:16:21.690 "base_bdev_name": "malloc2" 00:16:21.690 } 00:16:21.690 } 00:16:21.690 }' 00:16:21.690 17:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.690 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.690 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:21.690 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.690 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:21.950 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:22.209 [2024-07-23 17:10:17.519185] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:22.209 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b35343ff-8dc8-46c8-b8c3-8c3b5c032aad 00:16:22.209 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b35343ff-8dc8-46c8-b8c3-8c3b5c032aad ']' 00:16:22.209 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:22.468 [2024-07-23 17:10:17.763604] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:22.468 [2024-07-23 17:10:17.763628] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:22.468 [2024-07-23 17:10:17.763686] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:22.469 [2024-07-23 17:10:17.763729] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:22.469 [2024-07-23 17:10:17.763740] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x136f3e0 name raid_bdev1, state offline 00:16:22.469 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:16:22.469 17:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:22.728 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:22.728 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:22.728 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:22.728 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:22.987 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:22.987 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:23.247 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:23.247 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:16:23.506 17:10:18 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:23.506 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:16:23.506 [2024-07-23 17:10:18.918628] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:23.506 [2024-07-23 17:10:18.919950] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:23.506 [2024-07-23 17:10:18.920006] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:23.506 [2024-07-23 17:10:18.920043] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:23.506 
[2024-07-23 17:10:18.920062] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:23.506 [2024-07-23 17:10:18.920072] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125d090 name raid_bdev1, state configuring 00:16:23.506 request: 00:16:23.506 { 00:16:23.506 "name": "raid_bdev1", 00:16:23.506 "raid_level": "concat", 00:16:23.506 "base_bdevs": [ 00:16:23.506 "malloc1", 00:16:23.506 "malloc2" 00:16:23.506 ], 00:16:23.506 "strip_size_kb": 64, 00:16:23.506 "superblock": false, 00:16:23.506 "method": "bdev_raid_create", 00:16:23.506 "req_id": 1 00:16:23.506 } 00:16:23.506 Got JSON-RPC error response 00:16:23.506 response: 00:16:23.506 { 00:16:23.506 "code": -17, 00:16:23.506 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:23.506 } 00:16:23.765 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:23.765 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:23.765 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:23.765 17:10:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:23.765 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.765 17:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:24.025 [2024-07-23 17:10:19.419902] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:24.025 [2024-07-23 17:10:19.419947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:24.025 [2024-07-23 17:10:19.419966] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c3da0 00:16:24.025 [2024-07-23 17:10:19.419978] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:24.025 [2024-07-23 17:10:19.421512] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:24.025 [2024-07-23 17:10:19.421540] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:24.025 [2024-07-23 17:10:19.421602] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:24.025 [2024-07-23 17:10:19.421625] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:24.025 pt1 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.025 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.025 17:10:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.285 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.285 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.285 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.285 "name": "raid_bdev1", 00:16:24.285 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:24.285 "strip_size_kb": 64, 00:16:24.285 "state": "configuring", 00:16:24.285 "raid_level": "concat", 00:16:24.285 "superblock": true, 00:16:24.285 "num_base_bdevs": 2, 00:16:24.285 "num_base_bdevs_discovered": 1, 00:16:24.285 "num_base_bdevs_operational": 2, 00:16:24.285 "base_bdevs_list": [ 00:16:24.285 { 00:16:24.285 "name": "pt1", 00:16:24.285 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:24.285 "is_configured": true, 00:16:24.285 "data_offset": 2048, 00:16:24.285 "data_size": 63488 00:16:24.285 }, 00:16:24.285 { 00:16:24.285 "name": null, 00:16:24.285 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:24.285 "is_configured": false, 00:16:24.285 "data_offset": 2048, 00:16:24.285 "data_size": 63488 00:16:24.285 } 00:16:24.285 ] 00:16:24.285 }' 00:16:24.285 17:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.285 17:10:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.853 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:16:24.853 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:24.853 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:24.853 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:25.112 [2024-07-23 17:10:20.374456] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:25.112 [2024-07-23 17:10:20.374510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.112 [2024-07-23 17:10:20.374530] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c6b00 00:16:25.112 [2024-07-23 17:10:20.374542] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.112 [2024-07-23 17:10:20.374887] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.112 [2024-07-23 17:10:20.374921] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:25.112 [2024-07-23 17:10:20.374982] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:25.112 [2024-07-23 17:10:20.375000] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:25.112 [2024-07-23 17:10:20.375092] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13705d0 00:16:25.112 [2024-07-23 17:10:20.375102] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:16:25.112 [2024-07-23 17:10:20.375265] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13713f0 00:16:25.112 [2024-07-23 17:10:20.375385] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13705d0 00:16:25.112 [2024-07-23 17:10:20.375395] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13705d0 00:16:25.112 [2024-07-23 17:10:20.375497] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:25.112 pt2 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.112 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:25.410 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.410 "name": "raid_bdev1", 00:16:25.410 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:25.410 "strip_size_kb": 64, 00:16:25.410 "state": "online", 00:16:25.410 "raid_level": "concat", 00:16:25.410 "superblock": true, 00:16:25.410 "num_base_bdevs": 2, 00:16:25.410 "num_base_bdevs_discovered": 2, 00:16:25.410 "num_base_bdevs_operational": 2, 
00:16:25.410 "base_bdevs_list": [ 00:16:25.410 { 00:16:25.410 "name": "pt1", 00:16:25.410 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:25.410 "is_configured": true, 00:16:25.410 "data_offset": 2048, 00:16:25.410 "data_size": 63488 00:16:25.410 }, 00:16:25.410 { 00:16:25.410 "name": "pt2", 00:16:25.410 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:25.410 "is_configured": true, 00:16:25.410 "data_offset": 2048, 00:16:25.410 "data_size": 63488 00:16:25.410 } 00:16:25.410 ] 00:16:25.410 }' 00:16:25.410 17:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.411 17:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:26.004 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:26.264 [2024-07-23 17:10:21.429493] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:26.264 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:26.264 "name": "raid_bdev1", 00:16:26.264 "aliases": [ 00:16:26.264 "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad" 00:16:26.264 ], 
00:16:26.264 "product_name": "Raid Volume", 00:16:26.264 "block_size": 512, 00:16:26.264 "num_blocks": 126976, 00:16:26.264 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:26.264 "assigned_rate_limits": { 00:16:26.264 "rw_ios_per_sec": 0, 00:16:26.264 "rw_mbytes_per_sec": 0, 00:16:26.264 "r_mbytes_per_sec": 0, 00:16:26.264 "w_mbytes_per_sec": 0 00:16:26.264 }, 00:16:26.264 "claimed": false, 00:16:26.264 "zoned": false, 00:16:26.264 "supported_io_types": { 00:16:26.264 "read": true, 00:16:26.264 "write": true, 00:16:26.264 "unmap": true, 00:16:26.264 "flush": true, 00:16:26.264 "reset": true, 00:16:26.264 "nvme_admin": false, 00:16:26.264 "nvme_io": false, 00:16:26.264 "nvme_io_md": false, 00:16:26.264 "write_zeroes": true, 00:16:26.264 "zcopy": false, 00:16:26.264 "get_zone_info": false, 00:16:26.264 "zone_management": false, 00:16:26.264 "zone_append": false, 00:16:26.264 "compare": false, 00:16:26.264 "compare_and_write": false, 00:16:26.264 "abort": false, 00:16:26.264 "seek_hole": false, 00:16:26.264 "seek_data": false, 00:16:26.264 "copy": false, 00:16:26.264 "nvme_iov_md": false 00:16:26.264 }, 00:16:26.264 "memory_domains": [ 00:16:26.264 { 00:16:26.264 "dma_device_id": "system", 00:16:26.264 "dma_device_type": 1 00:16:26.264 }, 00:16:26.264 { 00:16:26.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.264 "dma_device_type": 2 00:16:26.264 }, 00:16:26.264 { 00:16:26.264 "dma_device_id": "system", 00:16:26.264 "dma_device_type": 1 00:16:26.264 }, 00:16:26.264 { 00:16:26.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.264 "dma_device_type": 2 00:16:26.264 } 00:16:26.264 ], 00:16:26.264 "driver_specific": { 00:16:26.264 "raid": { 00:16:26.264 "uuid": "b35343ff-8dc8-46c8-b8c3-8c3b5c032aad", 00:16:26.264 "strip_size_kb": 64, 00:16:26.264 "state": "online", 00:16:26.264 "raid_level": "concat", 00:16:26.264 "superblock": true, 00:16:26.264 "num_base_bdevs": 2, 00:16:26.264 "num_base_bdevs_discovered": 2, 00:16:26.264 "num_base_bdevs_operational": 
2, 00:16:26.264 "base_bdevs_list": [ 00:16:26.264 { 00:16:26.264 "name": "pt1", 00:16:26.264 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:26.264 "is_configured": true, 00:16:26.264 "data_offset": 2048, 00:16:26.264 "data_size": 63488 00:16:26.264 }, 00:16:26.264 { 00:16:26.264 "name": "pt2", 00:16:26.264 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:26.264 "is_configured": true, 00:16:26.264 "data_offset": 2048, 00:16:26.264 "data_size": 63488 00:16:26.264 } 00:16:26.264 ] 00:16:26.264 } 00:16:26.264 } 00:16:26.264 }' 00:16:26.264 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:26.264 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:26.264 pt2' 00:16:26.264 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.264 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:26.264 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.523 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.523 "name": "pt1", 00:16:26.523 "aliases": [ 00:16:26.523 "00000000-0000-0000-0000-000000000001" 00:16:26.523 ], 00:16:26.523 "product_name": "passthru", 00:16:26.523 "block_size": 512, 00:16:26.523 "num_blocks": 65536, 00:16:26.523 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:26.523 "assigned_rate_limits": { 00:16:26.523 "rw_ios_per_sec": 0, 00:16:26.523 "rw_mbytes_per_sec": 0, 00:16:26.523 "r_mbytes_per_sec": 0, 00:16:26.523 "w_mbytes_per_sec": 0 00:16:26.523 }, 00:16:26.523 "claimed": true, 00:16:26.523 "claim_type": "exclusive_write", 00:16:26.523 "zoned": false, 00:16:26.523 "supported_io_types": { 00:16:26.523 "read": true, 
00:16:26.523 "write": true, 00:16:26.523 "unmap": true, 00:16:26.523 "flush": true, 00:16:26.523 "reset": true, 00:16:26.523 "nvme_admin": false, 00:16:26.523 "nvme_io": false, 00:16:26.523 "nvme_io_md": false, 00:16:26.523 "write_zeroes": true, 00:16:26.523 "zcopy": true, 00:16:26.523 "get_zone_info": false, 00:16:26.523 "zone_management": false, 00:16:26.523 "zone_append": false, 00:16:26.523 "compare": false, 00:16:26.523 "compare_and_write": false, 00:16:26.523 "abort": true, 00:16:26.523 "seek_hole": false, 00:16:26.523 "seek_data": false, 00:16:26.523 "copy": true, 00:16:26.523 "nvme_iov_md": false 00:16:26.523 }, 00:16:26.523 "memory_domains": [ 00:16:26.523 { 00:16:26.523 "dma_device_id": "system", 00:16:26.523 "dma_device_type": 1 00:16:26.523 }, 00:16:26.523 { 00:16:26.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.524 "dma_device_type": 2 00:16:26.524 } 00:16:26.524 ], 00:16:26.524 "driver_specific": { 00:16:26.524 "passthru": { 00:16:26.524 "name": "pt1", 00:16:26.524 "base_bdev_name": "malloc1" 00:16:26.524 } 00:16:26.524 } 00:16:26.524 }' 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.524 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.783 17:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.783 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.783 17:10:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.783 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.783 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.783 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.783 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:26.783 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:27.042 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:27.042 "name": "pt2", 00:16:27.042 "aliases": [ 00:16:27.042 "00000000-0000-0000-0000-000000000002" 00:16:27.042 ], 00:16:27.042 "product_name": "passthru", 00:16:27.042 "block_size": 512, 00:16:27.042 "num_blocks": 65536, 00:16:27.042 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:27.042 "assigned_rate_limits": { 00:16:27.042 "rw_ios_per_sec": 0, 00:16:27.042 "rw_mbytes_per_sec": 0, 00:16:27.042 "r_mbytes_per_sec": 0, 00:16:27.042 "w_mbytes_per_sec": 0 00:16:27.042 }, 00:16:27.042 "claimed": true, 00:16:27.042 "claim_type": "exclusive_write", 00:16:27.042 "zoned": false, 00:16:27.042 "supported_io_types": { 00:16:27.042 "read": true, 00:16:27.042 "write": true, 00:16:27.042 "unmap": true, 00:16:27.042 "flush": true, 00:16:27.042 "reset": true, 00:16:27.042 "nvme_admin": false, 00:16:27.042 "nvme_io": false, 00:16:27.042 "nvme_io_md": false, 00:16:27.042 "write_zeroes": true, 00:16:27.042 "zcopy": true, 00:16:27.042 "get_zone_info": false, 00:16:27.042 "zone_management": false, 00:16:27.042 "zone_append": false, 00:16:27.042 "compare": false, 00:16:27.042 "compare_and_write": false, 00:16:27.042 "abort": true, 00:16:27.042 "seek_hole": false, 00:16:27.042 "seek_data": false, 00:16:27.042 "copy": 
true, 00:16:27.042 "nvme_iov_md": false 00:16:27.042 }, 00:16:27.042 "memory_domains": [ 00:16:27.042 { 00:16:27.042 "dma_device_id": "system", 00:16:27.042 "dma_device_type": 1 00:16:27.042 }, 00:16:27.042 { 00:16:27.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.042 "dma_device_type": 2 00:16:27.042 } 00:16:27.042 ], 00:16:27.042 "driver_specific": { 00:16:27.042 "passthru": { 00:16:27.042 "name": "pt2", 00:16:27.042 "base_bdev_name": "malloc2" 00:16:27.042 } 00:16:27.042 } 00:16:27.042 }' 00:16:27.042 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.042 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.042 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:27.042 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.042 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:27.301 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:16:27.560 [2024-07-23 17:10:22.897377] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b35343ff-8dc8-46c8-b8c3-8c3b5c032aad '!=' b35343ff-8dc8-46c8-b8c3-8c3b5c032aad ']' 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4117339 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4117339 ']' 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4117339 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4117339 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4117339' 00:16:27.560 killing process with pid 4117339 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4117339 00:16:27.560 [2024-07-23 17:10:22.975613] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:27.560 [2024-07-23 17:10:22.975669] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:27.560 [2024-07-23 
17:10:22.975708] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:27.560 [2024-07-23 17:10:22.975719] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13705d0 name raid_bdev1, state offline 00:16:27.560 17:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4117339 00:16:27.820 [2024-07-23 17:10:22.993523] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:27.820 17:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:27.820 00:16:27.820 real 0m12.730s 00:16:27.820 user 0m22.829s 00:16:27.820 sys 0m2.271s 00:16:27.820 17:10:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:27.820 17:10:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.820 ************************************ 00:16:27.820 END TEST raid_superblock_test 00:16:27.820 ************************************ 00:16:28.079 17:10:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:28.079 17:10:23 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:16:28.079 17:10:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:28.079 17:10:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:28.079 17:10:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:28.079 ************************************ 00:16:28.079 START TEST raid_read_error_test 00:16:28.079 ************************************ 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:16:28.079 17:10:23 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:28.079 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zA64ZI445k 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4119302 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4119302 /var/tmp/spdk-raid.sock 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4119302 ']' 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:28.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:28.080 17:10:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.080 [2024-07-23 17:10:23.371633] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:16:28.080 [2024-07-23 17:10:23.371703] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4119302 ] 00:16:28.339 [2024-07-23 17:10:23.505062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:28.339 [2024-07-23 17:10:23.560799] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:28.339 [2024-07-23 17:10:23.632494] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.339 [2024-07-23 17:10:23.632534] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.907 17:10:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:28.907 17:10:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:28.907 17:10:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:28.907 17:10:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:29.475 BaseBdev1_malloc 00:16:29.475 17:10:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:29.734 true 00:16:29.734 17:10:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:30.301 [2024-07-23 17:10:25.547111] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:30.301 [2024-07-23 17:10:25.547160] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:30.301 [2024-07-23 17:10:25.547180] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dd55c0 00:16:30.301 [2024-07-23 17:10:25.547193] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:30.301 [2024-07-23 17:10:25.548878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:30.301 [2024-07-23 17:10:25.548913] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:30.301 BaseBdev1 00:16:30.302 17:10:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:30.302 17:10:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:30.561 BaseBdev2_malloc 00:16:30.561 17:10:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:31.128 true 00:16:31.128 17:10:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:31.387 [2024-07-23 17:10:26.571705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:31.387 [2024-07-23 17:10:26.571752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.387 [2024-07-23 17:10:26.571775] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dcf620 00:16:31.387 [2024-07-23 17:10:26.571788] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.387 [2024-07-23 17:10:26.573351] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.387 [2024-07-23 17:10:26.573379] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:31.387 BaseBdev2 00:16:31.387 17:10:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:16:31.956 [2024-07-23 17:10:27.073050] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.956 [2024-07-23 17:10:27.074403] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:31.956 [2024-07-23 17:10:27.074581] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c23610 00:16:31.956 [2024-07-23 17:10:27.074594] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:16:31.956 [2024-07-23 17:10:27.074795] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cbda80 00:16:31.956 [2024-07-23 17:10:27.074952] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c23610 00:16:31.956 [2024-07-23 17:10:27.074962] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c23610 00:16:31.956 [2024-07-23 17:10:27.075072] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.956 "name": "raid_bdev1", 00:16:31.956 "uuid": "00e6b327-a6a4-4914-b2cb-8e63369b144e", 00:16:31.956 "strip_size_kb": 64, 00:16:31.956 "state": "online", 00:16:31.956 "raid_level": "concat", 00:16:31.956 "superblock": true, 00:16:31.956 "num_base_bdevs": 2, 00:16:31.956 "num_base_bdevs_discovered": 2, 00:16:31.956 "num_base_bdevs_operational": 2, 00:16:31.956 "base_bdevs_list": [ 00:16:31.956 { 00:16:31.956 "name": "BaseBdev1", 00:16:31.956 "uuid": "afdafd74-de92-520c-812f-9d83d2495099", 00:16:31.956 "is_configured": true, 00:16:31.956 "data_offset": 2048, 00:16:31.956 "data_size": 63488 00:16:31.956 }, 00:16:31.956 { 00:16:31.956 "name": "BaseBdev2", 00:16:31.956 "uuid": "53718515-e88d-52fa-8eca-227f285b1b27", 00:16:31.956 "is_configured": true, 00:16:31.956 "data_offset": 2048, 00:16:31.956 "data_size": 63488 00:16:31.956 } 00:16:31.956 ] 00:16:31.956 }' 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.956 17:10:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.525 17:10:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:32.525 17:10:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:32.784 [2024-07-23 17:10:28.051918] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cbd930 00:16:33.723 17:10:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.983 17:10:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.983 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:34.243 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.243 "name": "raid_bdev1", 00:16:34.243 "uuid": "00e6b327-a6a4-4914-b2cb-8e63369b144e", 00:16:34.243 "strip_size_kb": 64, 00:16:34.243 "state": "online", 00:16:34.243 "raid_level": "concat", 00:16:34.243 "superblock": true, 00:16:34.243 "num_base_bdevs": 2, 00:16:34.243 "num_base_bdevs_discovered": 2, 00:16:34.243 "num_base_bdevs_operational": 2, 00:16:34.243 "base_bdevs_list": [ 00:16:34.243 { 00:16:34.243 "name": "BaseBdev1", 00:16:34.243 "uuid": "afdafd74-de92-520c-812f-9d83d2495099", 00:16:34.243 "is_configured": true, 00:16:34.243 "data_offset": 2048, 00:16:34.243 "data_size": 63488 00:16:34.243 }, 00:16:34.243 { 00:16:34.243 "name": "BaseBdev2", 00:16:34.243 "uuid": "53718515-e88d-52fa-8eca-227f285b1b27", 00:16:34.243 "is_configured": true, 00:16:34.243 "data_offset": 2048, 00:16:34.243 "data_size": 63488 00:16:34.243 } 00:16:34.243 ] 00:16:34.243 }' 00:16:34.243 17:10:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.243 17:10:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.812 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:34.813 [2024-07-23 17:10:30.189073] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:34.813 [2024-07-23 17:10:30.189115] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:16:34.813 [2024-07-23 17:10:30.192261] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:34.813 [2024-07-23 17:10:30.192294] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.813 [2024-07-23 17:10:30.192322] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:34.813 [2024-07-23 17:10:30.192333] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c23610 name raid_bdev1, state offline 00:16:34.813 0 00:16:34.813 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4119302 00:16:34.813 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4119302 ']' 00:16:34.813 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4119302 00:16:34.813 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:34.813 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:34.813 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4119302 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4119302' 00:16:35.073 killing process with pid 4119302 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4119302 00:16:35.073 [2024-07-23 17:10:30.268984] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4119302 00:16:35.073 [2024-07-23 17:10:30.279362] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zA64ZI445k 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:16:35.073 00:16:35.073 real 0m7.199s 00:16:35.073 user 0m11.539s 00:16:35.073 sys 0m1.240s 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:35.073 17:10:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.073 ************************************ 00:16:35.073 END TEST raid_read_error_test 00:16:35.073 ************************************ 00:16:35.333 17:10:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:35.333 17:10:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:16:35.333 17:10:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:35.333 17:10:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:35.333 17:10:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:35.333 ************************************ 00:16:35.333 START TEST raid_write_error_test 00:16:35.333 ************************************ 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:35.333 17:10:30 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.P1I8EzP5mK 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4120294 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4120294 /var/tmp/spdk-raid.sock 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4120294 ']' 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:35.333 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:35.333 17:10:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.333 [2024-07-23 17:10:30.700450] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:16:35.333 [2024-07-23 17:10:30.700586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4120294 ] 00:16:35.593 [2024-07-23 17:10:30.904589] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:35.593 [2024-07-23 17:10:30.958114] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:35.852 [2024-07-23 17:10:31.027069] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:35.852 [2024-07-23 17:10:31.027104] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:36.790 17:10:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:36.790 17:10:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:36.790 17:10:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:36.790 17:10:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:37.049 BaseBdev1_malloc 00:16:37.049 17:10:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:37.618 true 00:16:37.618 17:10:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:37.877 [2024-07-23 17:10:33.122379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:37.877 [2024-07-23 17:10:33.122424] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:16:37.877 [2024-07-23 17:10:33.122445] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xce65c0 00:16:37.877 [2024-07-23 17:10:33.122458] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:37.877 [2024-07-23 17:10:33.124128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:37.877 [2024-07-23 17:10:33.124155] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:37.878 BaseBdev1 00:16:37.878 17:10:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:37.878 17:10:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:38.447 BaseBdev2_malloc 00:16:38.447 17:10:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:38.706 true 00:16:38.706 17:10:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:38.964 [2024-07-23 17:10:34.150955] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:38.964 [2024-07-23 17:10:34.151001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:38.964 [2024-07-23 17:10:34.151024] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xce0620 00:16:38.964 [2024-07-23 17:10:34.151036] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:38.964 [2024-07-23 17:10:34.152639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:38.964 [2024-07-23 17:10:34.152668] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:16:38.964 BaseBdev2
00:16:38.964 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
00:16:38.964 [2024-07-23 17:10:34.383593] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:38.964 [2024-07-23 17:10:34.384871] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:38.964 [2024-07-23 17:10:34.385053] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xb34610
00:16:38.964 [2024-07-23 17:10:34.385067] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512
00:16:38.964 [2024-07-23 17:10:34.385260] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbcea80
00:16:38.964 [2024-07-23 17:10:34.385403] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb34610
00:16:38.964 [2024-07-23 17:10:34.385412] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb34610
00:16:38.964 [2024-07-23 17:10:34.385519] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:39.223 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:39.482 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:39.482 "name": "raid_bdev1",
00:16:39.482 "uuid": "d5a2553f-c0ab-4fe8-8fad-5281bffb765c",
00:16:39.482 "strip_size_kb": 64,
00:16:39.482 "state": "online",
00:16:39.482 "raid_level": "concat",
00:16:39.482 "superblock": true,
00:16:39.482 "num_base_bdevs": 2,
00:16:39.482 "num_base_bdevs_discovered": 2,
00:16:39.482 "num_base_bdevs_operational": 2,
00:16:39.482 "base_bdevs_list": [
00:16:39.482 {
00:16:39.482 "name": "BaseBdev1",
00:16:39.482 "uuid": "bb0d7834-b8e3-5e2f-81d0-7f1d4bcb5e30",
00:16:39.482 "is_configured": true,
00:16:39.482 "data_offset": 2048,
00:16:39.482 "data_size": 63488
00:16:39.482 },
00:16:39.482 {
00:16:39.482 "name": "BaseBdev2",
00:16:39.482 "uuid": "d700a95b-1920-5e1a-b9e6-accecf22230c",
00:16:39.482 "is_configured": true,
00:16:39.482 "data_offset": 2048,
00:16:39.482 "data_size": 63488
00:16:39.482 }
00:16:39.482 ]
00:16:39.482 }'
00:16:39.482 17:10:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:39.482 17:10:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:40.052 17:10:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:16:40.052 17:10:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:16:40.052 [2024-07-23 17:10:35.338386] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbce930
00:16:41.028 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]]
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:41.288 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:41.548 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:41.548 "name": "raid_bdev1",
00:16:41.548 "uuid": "d5a2553f-c0ab-4fe8-8fad-5281bffb765c",
00:16:41.548 "strip_size_kb": 64,
00:16:41.548 "state": "online",
00:16:41.548 "raid_level": "concat",
00:16:41.548 "superblock": true,
00:16:41.548 "num_base_bdevs": 2,
00:16:41.548 "num_base_bdevs_discovered": 2,
00:16:41.548 "num_base_bdevs_operational": 2,
00:16:41.548 "base_bdevs_list": [
00:16:41.548 {
00:16:41.548 "name": "BaseBdev1",
00:16:41.548 "uuid": "bb0d7834-b8e3-5e2f-81d0-7f1d4bcb5e30",
00:16:41.548 "is_configured": true,
00:16:41.548 "data_offset": 2048,
00:16:41.548 "data_size": 63488
00:16:41.548 },
00:16:41.548 {
00:16:41.548 "name": "BaseBdev2",
00:16:41.548 "uuid": "d700a95b-1920-5e1a-b9e6-accecf22230c",
00:16:41.548 "is_configured": true,
00:16:41.548 "data_offset": 2048,
00:16:41.548 "data_size": 63488
00:16:41.548 }
00:16:41.548 ]
00:16:41.548 }'
00:16:41.548 17:10:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:41.548 17:10:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:16:42.484 [2024-07-23 17:10:37.838913] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:16:42.484 [2024-07-23 17:10:37.838952] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:16:42.484 [2024-07-23 17:10:37.842158] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:16:42.484 [2024-07-23 17:10:37.842189] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:42.484 [2024-07-23 17:10:37.842216] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:16:42.484 [2024-07-23 17:10:37.842227] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb34610 name raid_bdev1, state offline
00:16:42.484 0
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4120294
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4120294 ']'
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4120294
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:16:42.484 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4120294
00:16:42.743 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:16:42.743 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:16:42.743 17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4120294'
killing process with pid 4120294
17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4120294
[2024-07-23 17:10:37.924474] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
17:10:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4120294
00:16:42.743 [2024-07-23 17:10:37.935459] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.P1I8EzP5mK
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]]
00:16:42.743
00:16:42.743 real 0m7.586s
00:16:42.743 user 0m12.199s
00:16:42.743 sys 0m1.338s
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:16:42.743 17:10:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:16:42.743 ************************************
00:16:42.743 END TEST raid_write_error_test
************************************
00:16:43.001 17:10:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:16:43.001 17:10:38 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:16:43.001 17:10:38 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false
00:16:43.001 17:10:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:16:43.001 17:10:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:16:43.001 17:10:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:16:43.001 ************************************
00:16:43.001 START TEST raid_state_function_test
00:16:43.001 ************************************
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']'
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4121421
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4121421'
Process raid pid: 4121421
17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
17:10:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4121421 /var/tmp/spdk-raid.sock
17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4121421 ']'
17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:16:43.001 17:10:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:43.001 [2024-07-23 17:10:38.324820] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:16:43.001 [2024-07-23 17:10:38.324907] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:16:43.260 [2024-07-23 17:10:38.459901] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:16:43.260 [2024-07-23 17:10:38.514897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:16:43.260 [2024-07-23 17:10:38.574273] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:43.260 [2024-07-23 17:10:38.574304] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:43.825 17:10:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:16:43.825 17:10:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:16:43.825 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:16:44.083 [2024-07-23 17:10:39.422057] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:16:44.083 [2024-07-23 17:10:39.422097] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:16:44.083 [2024-07-23 17:10:39.422107] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:16:44.083 [2024-07-23 17:10:39.422119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:44.083 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:44.341 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:44.341 "name": "Existed_Raid",
00:16:44.341 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:44.341 "strip_size_kb": 0,
00:16:44.341 "state": "configuring",
00:16:44.341 "raid_level": "raid1",
00:16:44.341 "superblock": false,
00:16:44.341 "num_base_bdevs": 2,
00:16:44.341 "num_base_bdevs_discovered": 0,
00:16:44.341 "num_base_bdevs_operational": 2,
00:16:44.341 "base_bdevs_list": [
00:16:44.341 {
00:16:44.341 "name": "BaseBdev1",
00:16:44.341 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:44.341 "is_configured": false,
00:16:44.341 "data_offset": 0,
00:16:44.341 "data_size": 0
00:16:44.341 },
00:16:44.341 {
00:16:44.341 "name": "BaseBdev2",
00:16:44.341 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:44.341 "is_configured": false,
00:16:44.341 "data_offset": 0,
00:16:44.341 "data_size": 0
00:16:44.341 }
00:16:44.341 ]
00:16:44.341 }'
00:16:44.341 17:10:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:44.341 17:10:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:44.909 17:10:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:16:45.167 [2024-07-23 17:10:40.528866] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:16:45.167 [2024-07-23 17:10:40.528918] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa53f0 name Existed_Raid, state configuring
00:16:45.167 17:10:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:16:45.425 [2024-07-23 17:10:40.781528] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:16:45.425 [2024-07-23 17:10:40.781556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:16:45.426 [2024-07-23 17:10:40.781566] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:16:45.426 [2024-07-23 17:10:40.781578] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:16:45.426 17:10:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:16:45.684 [2024-07-23 17:10:41.040079] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:45.684 BaseBdev1
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:45.684 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:45.943 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:16:46.201 [
00:16:46.201 {
00:16:46.201 "name": "BaseBdev1",
00:16:46.201 "aliases": [
00:16:46.201 "f8d9cf86-8970-4ddb-889a-0577e16a8f43"
00:16:46.201 ],
00:16:46.201 "product_name": "Malloc disk",
00:16:46.201 "block_size": 512,
00:16:46.201 "num_blocks": 65536,
00:16:46.201 "uuid": "f8d9cf86-8970-4ddb-889a-0577e16a8f43",
00:16:46.201 "assigned_rate_limits": {
00:16:46.201 "rw_ios_per_sec": 0,
00:16:46.201 "rw_mbytes_per_sec": 0,
00:16:46.201 "r_mbytes_per_sec": 0,
00:16:46.201 "w_mbytes_per_sec": 0
00:16:46.201 },
00:16:46.201 "claimed": true,
00:16:46.201 "claim_type": "exclusive_write",
00:16:46.201 "zoned": false,
00:16:46.201 "supported_io_types": {
00:16:46.201 "read": true,
00:16:46.201 "write": true,
00:16:46.201 "unmap": true,
00:16:46.201 "flush": true,
00:16:46.201 "reset": true,
00:16:46.201 "nvme_admin": false,
00:16:46.201 "nvme_io": false,
00:16:46.201 "nvme_io_md": false,
00:16:46.201 "write_zeroes": true,
00:16:46.201 "zcopy": true,
00:16:46.201 "get_zone_info": false,
00:16:46.201 "zone_management": false,
00:16:46.201 "zone_append": false,
00:16:46.202 "compare": false,
00:16:46.202 "compare_and_write": false,
00:16:46.202 "abort": true,
00:16:46.202 "seek_hole": false,
00:16:46.202 "seek_data": false,
00:16:46.202 "copy": true,
00:16:46.202 "nvme_iov_md": false
00:16:46.202 },
00:16:46.202 "memory_domains": [
00:16:46.202 {
00:16:46.202 "dma_device_id": "system",
00:16:46.202 "dma_device_type": 1
00:16:46.202 },
00:16:46.202 {
00:16:46.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:46.202 "dma_device_type": 2
00:16:46.202 }
00:16:46.202 ],
00:16:46.202 "driver_specific": {}
00:16:46.202 }
00:16:46.202 ]
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:46.202 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:46.461 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:46.461 "name": "Existed_Raid",
00:16:46.461 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:46.461 "strip_size_kb": 0,
00:16:46.461 "state": "configuring",
00:16:46.461 "raid_level": "raid1",
00:16:46.461 "superblock": false,
00:16:46.461 "num_base_bdevs": 2,
00:16:46.461 "num_base_bdevs_discovered": 1,
00:16:46.461 "num_base_bdevs_operational": 2,
00:16:46.461 "base_bdevs_list": [
00:16:46.461 {
00:16:46.461 "name": "BaseBdev1",
00:16:46.461 "uuid": "f8d9cf86-8970-4ddb-889a-0577e16a8f43",
00:16:46.461 "is_configured": true,
00:16:46.461 "data_offset": 0,
00:16:46.461 "data_size": 65536
00:16:46.461 },
00:16:46.461 {
00:16:46.461 "name": "BaseBdev2",
00:16:46.461 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:46.461 "is_configured": false,
00:16:46.461 "data_offset": 0,
00:16:46.461 "data_size": 0
00:16:46.461 }
00:16:46.461 ]
00:16:46.461 }'
00:16:46.461 17:10:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:46.461 17:10:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:47.028 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:16:47.287 [2024-07-23 17:10:42.664410] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:16:47.287 [2024-07-23 17:10:42.664456] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa4d20 name Existed_Raid, state configuring
00:16:47.287 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:16:47.547 [2024-07-23 17:10:42.909080] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:47.547 [2024-07-23 17:10:42.910577] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:16:47.547 [2024-07-23 17:10:42.910612] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:47.547 17:10:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:47.806 17:10:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:47.806 "name": "Existed_Raid",
00:16:47.806 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:47.806 "strip_size_kb": 0,
00:16:47.806 "state": "configuring",
00:16:47.806 "raid_level": "raid1",
00:16:47.806 "superblock": false,
00:16:47.806 "num_base_bdevs": 2,
00:16:47.806 "num_base_bdevs_discovered": 1,
00:16:47.806 "num_base_bdevs_operational": 2,
00:16:47.806 "base_bdevs_list": [
00:16:47.806 {
00:16:47.806 "name": "BaseBdev1",
00:16:47.806 "uuid": "f8d9cf86-8970-4ddb-889a-0577e16a8f43",
00:16:47.806 "is_configured": true,
00:16:47.806 "data_offset": 0,
00:16:47.806 "data_size": 65536
00:16:47.806 },
00:16:47.806 {
00:16:47.806 "name": "BaseBdev2",
00:16:47.806 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:47.806 "is_configured": false,
00:16:47.806 "data_offset": 0,
00:16:47.806 "data_size": 0
00:16:47.806 }
00:16:47.806 ]
00:16:47.806 }'
00:16:47.807 17:10:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:47.807 17:10:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:48.374 17:10:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:16:48.632 [2024-07-23 17:10:44.019408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:48.632 [2024-07-23 17:10:44.019453] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1aa4970
00:16:48.632 [2024-07-23 17:10:44.019462] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512
00:16:48.632 [2024-07-23 17:10:44.019652] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c495e0
00:16:48.632 [2024-07-23 17:10:44.019772] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1aa4970
00:16:48.632 [2024-07-23 17:10:44.019782] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1aa4970
00:16:48.632 [2024-07-23 17:10:44.019955] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:48.632 BaseBdev2
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:48.632 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:48.890 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:16:49.149 [
00:16:49.149 {
00:16:49.149 "name": "BaseBdev2",
00:16:49.149 "aliases": [
00:16:49.149 "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3"
00:16:49.149 ],
00:16:49.149 "product_name": "Malloc disk",
00:16:49.149 "block_size": 512,
00:16:49.149 "num_blocks": 65536,
00:16:49.149 "uuid": "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3",
00:16:49.149 "assigned_rate_limits": {
00:16:49.149 "rw_ios_per_sec": 0,
00:16:49.149 "rw_mbytes_per_sec": 0,
00:16:49.149 "r_mbytes_per_sec": 0,
00:16:49.149 "w_mbytes_per_sec": 0
00:16:49.149 },
00:16:49.149 "claimed": true,
00:16:49.149 "claim_type": "exclusive_write",
00:16:49.149 "zoned": false,
00:16:49.150 "supported_io_types": {
00:16:49.150 "read": true,
00:16:49.150 "write": true,
00:16:49.150 "unmap": true,
00:16:49.150 "flush": true,
00:16:49.150 "reset": true,
00:16:49.150 "nvme_admin": false,
00:16:49.150 "nvme_io": false,
00:16:49.150 "nvme_io_md": false,
00:16:49.150 "write_zeroes": true,
00:16:49.150 "zcopy": true,
00:16:49.150 "get_zone_info": false,
00:16:49.150 "zone_management": false,
00:16:49.150 "zone_append": false,
00:16:49.150 "compare": false,
00:16:49.150 "compare_and_write": false,
00:16:49.150 "abort": true,
00:16:49.150 "seek_hole": false,
00:16:49.150 "seek_data": false,
00:16:49.150 "copy": true,
00:16:49.150 "nvme_iov_md": false
00:16:49.150 },
00:16:49.150 "memory_domains": [
00:16:49.150 {
00:16:49.150 "dma_device_id": "system",
00:16:49.150 "dma_device_type": 1
00:16:49.150 },
00:16:49.150 {
00:16:49.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:49.150 "dma_device_type": 2
00:16:49.150 }
00:16:49.150 ],
00:16:49.150 "driver_specific": {}
00:16:49.150 }
00:16:49.150 ]
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:49.150 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:49.408 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:49.408 "name": "Existed_Raid",
00:16:49.408 "uuid": "467b9c77-8d27-475e-9621-7e42f20605d7",
00:16:49.408 "strip_size_kb": 0,
00:16:49.409 "state": "online",
00:16:49.409 "raid_level": "raid1",
00:16:49.409 "superblock": false,
00:16:49.409 "num_base_bdevs": 2,
00:16:49.409 "num_base_bdevs_discovered": 2,
00:16:49.409 "num_base_bdevs_operational": 2,
00:16:49.409 "base_bdevs_list": [
00:16:49.409 {
00:16:49.409 "name": "BaseBdev1",
00:16:49.409 "uuid": "f8d9cf86-8970-4ddb-889a-0577e16a8f43",
00:16:49.409 "is_configured": true,
00:16:49.409 "data_offset": 0,
00:16:49.409 "data_size": 65536
00:16:49.409 },
00:16:49.409 {
00:16:49.409 "name": "BaseBdev2",
00:16:49.409 "uuid": "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3",
00:16:49.409 "is_configured": true,
00:16:49.409 "data_offset": 0,
00:16:49.409 "data_size": 65536
00:16:49.409 }
00:16:49.409 ]
00:16:49.409 }'
00:16:49.409 17:10:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:49.409 17:10:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:49.975 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:16:49.975 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:16:49.975 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:16:49.976 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:16:49.976 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:16:50.234 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name
00:16:50.234 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:16:50.234 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:16:50.234 [2024-07-23 17:10:45.623934] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:16:50.234 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:16:50.234 "name": "Existed_Raid",
00:16:50.234 "aliases": [
00:16:50.234 "467b9c77-8d27-475e-9621-7e42f20605d7"
00:16:50.234 ],
00:16:50.234 "product_name": "Raid Volume",
00:16:50.234 "block_size": 512,
00:16:50.234 "num_blocks": 65536,
00:16:50.234 "uuid": "467b9c77-8d27-475e-9621-7e42f20605d7",
00:16:50.234 "assigned_rate_limits": {
00:16:50.234 "rw_ios_per_sec": 0,
00:16:50.234 "rw_mbytes_per_sec": 0,
00:16:50.234 "r_mbytes_per_sec": 0,
00:16:50.234 "w_mbytes_per_sec": 0
00:16:50.234 },
00:16:50.234 "claimed": false,
00:16:50.234 "zoned": false,
00:16:50.234 "supported_io_types": {
00:16:50.234 "read": true,
00:16:50.234 "write": true,
00:16:50.234 "unmap": false,
00:16:50.234 "flush": false,
00:16:50.234 "reset": true,
00:16:50.234 "nvme_admin": false,
00:16:50.234 "nvme_io": false,
00:16:50.234 "nvme_io_md": false,
00:16:50.234 "write_zeroes": true,
00:16:50.234 "zcopy": false,
00:16:50.234 "get_zone_info": false,
00:16:50.234 "zone_management": false,
00:16:50.234 "zone_append": false,
00:16:50.234 "compare": false,
00:16:50.234 "compare_and_write": false,
00:16:50.234 "abort": false,
00:16:50.234 "seek_hole": false,
00:16:50.234 "seek_data": false,
00:16:50.234 "copy": false,
00:16:50.234 "nvme_iov_md": false
00:16:50.234 },
00:16:50.234 "memory_domains": [
00:16:50.234 {
00:16:50.234 "dma_device_id": "system",
00:16:50.234 "dma_device_type": 1
00:16:50.234 },
00:16:50.234 {
00:16:50.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:50.234 "dma_device_type": 2
00:16:50.234 },
00:16:50.234 {
00:16:50.234 "dma_device_id": "system",
00:16:50.234 "dma_device_type": 1
00:16:50.234 },
00:16:50.234 {
00:16:50.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:50.234 "dma_device_type": 2
00:16:50.234 }
00:16:50.234 ],
00:16:50.234 "driver_specific": {
00:16:50.234 "raid": {
00:16:50.234 "uuid": "467b9c77-8d27-475e-9621-7e42f20605d7",
00:16:50.234 "strip_size_kb": 0,
00:16:50.234 "state": "online",
00:16:50.234 "raid_level": "raid1",
00:16:50.234 "superblock": false, 00:16:50.234 "num_base_bdevs": 2, 00:16:50.234 "num_base_bdevs_discovered": 2, 00:16:50.234 "num_base_bdevs_operational": 2, 00:16:50.234 "base_bdevs_list": [ 00:16:50.234 { 00:16:50.234 "name": "BaseBdev1", 00:16:50.234 "uuid": "f8d9cf86-8970-4ddb-889a-0577e16a8f43", 00:16:50.234 "is_configured": true, 00:16:50.234 "data_offset": 0, 00:16:50.234 "data_size": 65536 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "name": "BaseBdev2", 00:16:50.234 "uuid": "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3", 00:16:50.234 "is_configured": true, 00:16:50.234 "data_offset": 0, 00:16:50.234 "data_size": 65536 00:16:50.234 } 00:16:50.234 ] 00:16:50.234 } 00:16:50.234 } 00:16:50.234 }' 00:16:50.234 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:50.493 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:50.493 BaseBdev2' 00:16:50.493 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:50.493 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:50.493 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:50.751 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:50.752 "name": "BaseBdev1", 00:16:50.752 "aliases": [ 00:16:50.752 "f8d9cf86-8970-4ddb-889a-0577e16a8f43" 00:16:50.752 ], 00:16:50.752 "product_name": "Malloc disk", 00:16:50.752 "block_size": 512, 00:16:50.752 "num_blocks": 65536, 00:16:50.752 "uuid": "f8d9cf86-8970-4ddb-889a-0577e16a8f43", 00:16:50.752 "assigned_rate_limits": { 00:16:50.752 "rw_ios_per_sec": 0, 00:16:50.752 "rw_mbytes_per_sec": 0, 00:16:50.752 "r_mbytes_per_sec": 0, 00:16:50.752 
"w_mbytes_per_sec": 0 00:16:50.752 }, 00:16:50.752 "claimed": true, 00:16:50.752 "claim_type": "exclusive_write", 00:16:50.752 "zoned": false, 00:16:50.752 "supported_io_types": { 00:16:50.752 "read": true, 00:16:50.752 "write": true, 00:16:50.752 "unmap": true, 00:16:50.752 "flush": true, 00:16:50.752 "reset": true, 00:16:50.752 "nvme_admin": false, 00:16:50.752 "nvme_io": false, 00:16:50.752 "nvme_io_md": false, 00:16:50.752 "write_zeroes": true, 00:16:50.752 "zcopy": true, 00:16:50.752 "get_zone_info": false, 00:16:50.752 "zone_management": false, 00:16:50.752 "zone_append": false, 00:16:50.752 "compare": false, 00:16:50.752 "compare_and_write": false, 00:16:50.752 "abort": true, 00:16:50.752 "seek_hole": false, 00:16:50.752 "seek_data": false, 00:16:50.752 "copy": true, 00:16:50.752 "nvme_iov_md": false 00:16:50.752 }, 00:16:50.752 "memory_domains": [ 00:16:50.752 { 00:16:50.752 "dma_device_id": "system", 00:16:50.752 "dma_device_type": 1 00:16:50.752 }, 00:16:50.752 { 00:16:50.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.752 "dma_device_type": 2 00:16:50.752 } 00:16:50.752 ], 00:16:50.752 "driver_specific": {} 00:16:50.752 }' 00:16:50.752 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.752 17:10:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.752 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:50.752 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.752 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.752 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:50.752 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.752 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.010 
17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.010 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.010 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.010 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.010 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.010 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:51.010 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.268 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.268 "name": "BaseBdev2", 00:16:51.268 "aliases": [ 00:16:51.268 "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3" 00:16:51.268 ], 00:16:51.268 "product_name": "Malloc disk", 00:16:51.268 "block_size": 512, 00:16:51.268 "num_blocks": 65536, 00:16:51.268 "uuid": "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3", 00:16:51.268 "assigned_rate_limits": { 00:16:51.268 "rw_ios_per_sec": 0, 00:16:51.268 "rw_mbytes_per_sec": 0, 00:16:51.268 "r_mbytes_per_sec": 0, 00:16:51.268 "w_mbytes_per_sec": 0 00:16:51.268 }, 00:16:51.268 "claimed": true, 00:16:51.268 "claim_type": "exclusive_write", 00:16:51.268 "zoned": false, 00:16:51.268 "supported_io_types": { 00:16:51.268 "read": true, 00:16:51.268 "write": true, 00:16:51.268 "unmap": true, 00:16:51.268 "flush": true, 00:16:51.268 "reset": true, 00:16:51.268 "nvme_admin": false, 00:16:51.268 "nvme_io": false, 00:16:51.268 "nvme_io_md": false, 00:16:51.268 "write_zeroes": true, 00:16:51.268 "zcopy": true, 00:16:51.268 "get_zone_info": false, 00:16:51.268 "zone_management": false, 00:16:51.268 "zone_append": false, 00:16:51.268 "compare": 
false, 00:16:51.268 "compare_and_write": false, 00:16:51.268 "abort": true, 00:16:51.268 "seek_hole": false, 00:16:51.268 "seek_data": false, 00:16:51.268 "copy": true, 00:16:51.268 "nvme_iov_md": false 00:16:51.268 }, 00:16:51.268 "memory_domains": [ 00:16:51.268 { 00:16:51.268 "dma_device_id": "system", 00:16:51.268 "dma_device_type": 1 00:16:51.268 }, 00:16:51.268 { 00:16:51.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.268 "dma_device_type": 2 00:16:51.268 } 00:16:51.268 ], 00:16:51.268 "driver_specific": {} 00:16:51.268 }' 00:16:51.268 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.268 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.268 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.268 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.268 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.526 17:10:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:51.785 
[2024-07-23 17:10:47.123698] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.785 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.785 17:10:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.043 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.043 "name": "Existed_Raid", 00:16:52.043 "uuid": "467b9c77-8d27-475e-9621-7e42f20605d7", 00:16:52.043 "strip_size_kb": 0, 00:16:52.043 "state": "online", 00:16:52.043 "raid_level": "raid1", 00:16:52.043 "superblock": false, 00:16:52.043 "num_base_bdevs": 2, 00:16:52.043 "num_base_bdevs_discovered": 1, 00:16:52.043 "num_base_bdevs_operational": 1, 00:16:52.043 "base_bdevs_list": [ 00:16:52.043 { 00:16:52.043 "name": null, 00:16:52.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:52.043 "is_configured": false, 00:16:52.043 "data_offset": 0, 00:16:52.043 "data_size": 65536 00:16:52.043 }, 00:16:52.043 { 00:16:52.043 "name": "BaseBdev2", 00:16:52.043 "uuid": "02c1bc08-f7cb-4d01-ac60-44a906bdd1a3", 00:16:52.043 "is_configured": true, 00:16:52.043 "data_offset": 0, 00:16:52.043 "data_size": 65536 00:16:52.043 } 00:16:52.043 ] 00:16:52.043 }' 00:16:52.043 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.043 17:10:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.609 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:52.609 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:52.609 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:52.609 17:10:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.867 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:52.867 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:16:52.867 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:53.125 [2024-07-23 17:10:48.388948] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:53.125 [2024-07-23 17:10:48.389029] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:53.125 [2024-07-23 17:10:48.399861] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:53.125 [2024-07-23 17:10:48.399912] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:53.125 [2024-07-23 17:10:48.399923] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1aa4970 name Existed_Raid, state offline 00:16:53.125 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:53.125 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:53.125 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:53.125 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4121421 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4121421 ']' 00:16:53.384 17:10:48 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4121421 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4121421 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4121421' 00:16:53.384 killing process with pid 4121421 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4121421 00:16:53.384 [2024-07-23 17:10:48.734816] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:53.384 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4121421 00:16:53.384 [2024-07-23 17:10:48.735729] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:53.643 17:10:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:53.643 00:16:53.643 real 0m10.691s 00:16:53.643 user 0m18.951s 00:16:53.643 sys 0m2.048s 00:16:53.643 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:53.643 17:10:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.643 ************************************ 00:16:53.643 END TEST raid_state_function_test 00:16:53.643 ************************************ 00:16:53.643 17:10:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:53.643 17:10:48 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:16:53.643 17:10:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:53.643 17:10:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:53.643 17:10:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:53.643 ************************************ 00:16:53.643 START TEST raid_state_function_test_sb 00:16:53.643 ************************************ 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:53.643 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4123034 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4123034' 00:16:53.644 Process raid pid: 4123034 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4123034 /var/tmp/spdk-raid.sock 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4123034 ']' 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:53.644 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:53.644 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:53.903 [2024-07-23 17:10:49.108794] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:16:53.903 [2024-07-23 17:10:49.108868] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:53.903 [2024-07-23 17:10:49.243225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:53.903 [2024-07-23 17:10:49.296714] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:54.161 [2024-07-23 17:10:49.359520] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:54.161 [2024-07-23 17:10:49.359557] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:54.757 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:54.757 17:10:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:54.757 17:10:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:16:54.757 [2024-07-23 17:10:50.134412] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:16:54.757 [2024-07-23 17:10:50.134455] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:54.757 [2024-07-23 17:10:50.134466] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:54.757 [2024-07-23 17:10:50.134477] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.757 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:55.031 17:10:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:55.031 "name": "Existed_Raid", 00:16:55.031 "uuid": "101273a5-c7ed-49a0-89d0-d5f452249158", 00:16:55.031 "strip_size_kb": 0, 00:16:55.031 "state": "configuring", 00:16:55.031 "raid_level": "raid1", 00:16:55.031 "superblock": true, 00:16:55.031 "num_base_bdevs": 2, 00:16:55.031 "num_base_bdevs_discovered": 0, 00:16:55.031 "num_base_bdevs_operational": 2, 00:16:55.031 "base_bdevs_list": [ 00:16:55.031 { 00:16:55.031 "name": "BaseBdev1", 00:16:55.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.031 "is_configured": false, 00:16:55.031 "data_offset": 0, 00:16:55.031 "data_size": 0 00:16:55.031 }, 00:16:55.031 { 00:16:55.031 "name": "BaseBdev2", 00:16:55.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.031 "is_configured": false, 00:16:55.031 "data_offset": 0, 00:16:55.031 "data_size": 0 00:16:55.031 } 00:16:55.031 ] 00:16:55.031 }' 00:16:55.031 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:55.031 17:10:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.596 17:10:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.854 [2024-07-23 17:10:51.164995] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.854 [2024-07-23 17:10:51.165029] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17643f0 name Existed_Raid, state configuring 00:16:55.854 17:10:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:16:56.112 [2024-07-23 17:10:51.417680] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:56.112 
[2024-07-23 17:10:51.417710] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:56.112 [2024-07-23 17:10:51.417720] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:56.112 [2024-07-23 17:10:51.417732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:56.112 17:10:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:56.370 [2024-07-23 17:10:51.680319] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:56.370 BaseBdev1 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:56.370 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:56.628 17:10:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:56.885 [ 00:16:56.885 { 00:16:56.885 "name": "BaseBdev1", 00:16:56.885 "aliases": [ 00:16:56.885 
"f423b792-96c6-47c1-9ad1-f7569ce313a1" 00:16:56.885 ], 00:16:56.885 "product_name": "Malloc disk", 00:16:56.885 "block_size": 512, 00:16:56.885 "num_blocks": 65536, 00:16:56.885 "uuid": "f423b792-96c6-47c1-9ad1-f7569ce313a1", 00:16:56.885 "assigned_rate_limits": { 00:16:56.885 "rw_ios_per_sec": 0, 00:16:56.885 "rw_mbytes_per_sec": 0, 00:16:56.885 "r_mbytes_per_sec": 0, 00:16:56.885 "w_mbytes_per_sec": 0 00:16:56.885 }, 00:16:56.885 "claimed": true, 00:16:56.885 "claim_type": "exclusive_write", 00:16:56.885 "zoned": false, 00:16:56.885 "supported_io_types": { 00:16:56.885 "read": true, 00:16:56.885 "write": true, 00:16:56.885 "unmap": true, 00:16:56.886 "flush": true, 00:16:56.886 "reset": true, 00:16:56.886 "nvme_admin": false, 00:16:56.886 "nvme_io": false, 00:16:56.886 "nvme_io_md": false, 00:16:56.886 "write_zeroes": true, 00:16:56.886 "zcopy": true, 00:16:56.886 "get_zone_info": false, 00:16:56.886 "zone_management": false, 00:16:56.886 "zone_append": false, 00:16:56.886 "compare": false, 00:16:56.886 "compare_and_write": false, 00:16:56.886 "abort": true, 00:16:56.886 "seek_hole": false, 00:16:56.886 "seek_data": false, 00:16:56.886 "copy": true, 00:16:56.886 "nvme_iov_md": false 00:16:56.886 }, 00:16:56.886 "memory_domains": [ 00:16:56.886 { 00:16:56.886 "dma_device_id": "system", 00:16:56.886 "dma_device_type": 1 00:16:56.886 }, 00:16:56.886 { 00:16:56.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.886 "dma_device_type": 2 00:16:56.886 } 00:16:56.886 ], 00:16:56.886 "driver_specific": {} 00:16:56.886 } 00:16:56.886 ] 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.886 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.144 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.144 "name": "Existed_Raid", 00:16:57.144 "uuid": "7d489257-124a-477a-98d3-eaf47a755b9d", 00:16:57.144 "strip_size_kb": 0, 00:16:57.144 "state": "configuring", 00:16:57.144 "raid_level": "raid1", 00:16:57.144 "superblock": true, 00:16:57.144 "num_base_bdevs": 2, 00:16:57.144 "num_base_bdevs_discovered": 1, 00:16:57.144 "num_base_bdevs_operational": 2, 00:16:57.144 "base_bdevs_list": [ 00:16:57.144 { 00:16:57.144 "name": "BaseBdev1", 00:16:57.144 "uuid": "f423b792-96c6-47c1-9ad1-f7569ce313a1", 00:16:57.144 "is_configured": true, 00:16:57.144 "data_offset": 2048, 00:16:57.144 "data_size": 63488 00:16:57.144 }, 00:16:57.144 { 00:16:57.144 "name": "BaseBdev2", 00:16:57.144 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:57.144 "is_configured": false, 00:16:57.144 "data_offset": 0, 00:16:57.144 "data_size": 0 00:16:57.144 } 00:16:57.144 ] 00:16:57.144 }' 00:16:57.144 17:10:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.144 17:10:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:57.710 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:57.968 [2024-07-23 17:10:53.276542] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:57.968 [2024-07-23 17:10:53.276585] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1763d20 name Existed_Raid, state configuring 00:16:57.969 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:16:58.226 [2024-07-23 17:10:53.525232] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:58.226 [2024-07-23 17:10:53.526697] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:58.226 [2024-07-23 17:10:53.526732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.226 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.484 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.484 "name": "Existed_Raid", 00:16:58.484 "uuid": "176b62ca-6281-4bd4-91c7-418cb397aba4", 00:16:58.484 "strip_size_kb": 0, 00:16:58.484 "state": "configuring", 00:16:58.484 "raid_level": "raid1", 00:16:58.484 "superblock": true, 00:16:58.484 "num_base_bdevs": 2, 00:16:58.484 "num_base_bdevs_discovered": 1, 00:16:58.484 "num_base_bdevs_operational": 2, 00:16:58.484 "base_bdevs_list": [ 00:16:58.484 { 00:16:58.484 "name": "BaseBdev1", 00:16:58.484 "uuid": "f423b792-96c6-47c1-9ad1-f7569ce313a1", 00:16:58.484 "is_configured": true, 00:16:58.484 "data_offset": 2048, 00:16:58.484 "data_size": 63488 00:16:58.484 }, 00:16:58.484 
{ 00:16:58.484 "name": "BaseBdev2", 00:16:58.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.484 "is_configured": false, 00:16:58.484 "data_offset": 0, 00:16:58.484 "data_size": 0 00:16:58.484 } 00:16:58.484 ] 00:16:58.484 }' 00:16:58.484 17:10:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.484 17:10:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:59.050 17:10:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:59.309 [2024-07-23 17:10:54.607503] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:59.309 [2024-07-23 17:10:54.607653] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1763970 00:16:59.309 [2024-07-23 17:10:54.607667] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:59.309 [2024-07-23 17:10:54.607836] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17630c0 00:16:59.309 [2024-07-23 17:10:54.607970] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1763970 00:16:59.309 [2024-07-23 17:10:54.607981] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1763970 00:16:59.309 [2024-07-23 17:10:54.608074] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:59.309 BaseBdev2 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:59.309 17:10:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.567 17:10:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:59.826 [ 00:16:59.826 { 00:16:59.826 "name": "BaseBdev2", 00:16:59.826 "aliases": [ 00:16:59.826 "3ab760cf-528c-4314-a064-9a7c0099949d" 00:16:59.826 ], 00:16:59.826 "product_name": "Malloc disk", 00:16:59.826 "block_size": 512, 00:16:59.826 "num_blocks": 65536, 00:16:59.826 "uuid": "3ab760cf-528c-4314-a064-9a7c0099949d", 00:16:59.826 "assigned_rate_limits": { 00:16:59.826 "rw_ios_per_sec": 0, 00:16:59.826 "rw_mbytes_per_sec": 0, 00:16:59.826 "r_mbytes_per_sec": 0, 00:16:59.826 "w_mbytes_per_sec": 0 00:16:59.826 }, 00:16:59.826 "claimed": true, 00:16:59.826 "claim_type": "exclusive_write", 00:16:59.826 "zoned": false, 00:16:59.826 "supported_io_types": { 00:16:59.826 "read": true, 00:16:59.826 "write": true, 00:16:59.826 "unmap": true, 00:16:59.826 "flush": true, 00:16:59.826 "reset": true, 00:16:59.826 "nvme_admin": false, 00:16:59.826 "nvme_io": false, 00:16:59.826 "nvme_io_md": false, 00:16:59.826 "write_zeroes": true, 00:16:59.826 "zcopy": true, 00:16:59.826 "get_zone_info": false, 00:16:59.826 "zone_management": false, 00:16:59.826 "zone_append": false, 00:16:59.826 "compare": false, 00:16:59.826 "compare_and_write": false, 00:16:59.826 "abort": true, 00:16:59.826 "seek_hole": false, 00:16:59.826 "seek_data": false, 00:16:59.826 "copy": true, 00:16:59.826 
"nvme_iov_md": false 00:16:59.826 }, 00:16:59.826 "memory_domains": [ 00:16:59.826 { 00:16:59.826 "dma_device_id": "system", 00:16:59.826 "dma_device_type": 1 00:16:59.826 }, 00:16:59.826 { 00:16:59.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.826 "dma_device_type": 2 00:16:59.826 } 00:16:59.826 ], 00:16:59.826 "driver_specific": {} 00:16:59.826 } 00:16:59.826 ] 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.826 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.085 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.085 "name": "Existed_Raid", 00:17:00.085 "uuid": "176b62ca-6281-4bd4-91c7-418cb397aba4", 00:17:00.085 "strip_size_kb": 0, 00:17:00.085 "state": "online", 00:17:00.085 "raid_level": "raid1", 00:17:00.085 "superblock": true, 00:17:00.085 "num_base_bdevs": 2, 00:17:00.085 "num_base_bdevs_discovered": 2, 00:17:00.085 "num_base_bdevs_operational": 2, 00:17:00.085 "base_bdevs_list": [ 00:17:00.085 { 00:17:00.085 "name": "BaseBdev1", 00:17:00.085 "uuid": "f423b792-96c6-47c1-9ad1-f7569ce313a1", 00:17:00.085 "is_configured": true, 00:17:00.085 "data_offset": 2048, 00:17:00.085 "data_size": 63488 00:17:00.085 }, 00:17:00.085 { 00:17:00.085 "name": "BaseBdev2", 00:17:00.085 "uuid": "3ab760cf-528c-4314-a064-9a7c0099949d", 00:17:00.085 "is_configured": true, 00:17:00.085 "data_offset": 2048, 00:17:00.085 "data_size": 63488 00:17:00.085 } 00:17:00.085 ] 00:17:00.085 }' 00:17:00.085 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.085 17:10:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.651 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:00.651 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:00.651 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:00.651 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:00.651 17:10:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:00.651 17:10:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:00.651 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:00.651 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:00.910 [2024-07-23 17:10:56.228085] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:00.910 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:00.910 "name": "Existed_Raid", 00:17:00.910 "aliases": [ 00:17:00.910 "176b62ca-6281-4bd4-91c7-418cb397aba4" 00:17:00.910 ], 00:17:00.910 "product_name": "Raid Volume", 00:17:00.910 "block_size": 512, 00:17:00.910 "num_blocks": 63488, 00:17:00.910 "uuid": "176b62ca-6281-4bd4-91c7-418cb397aba4", 00:17:00.910 "assigned_rate_limits": { 00:17:00.910 "rw_ios_per_sec": 0, 00:17:00.910 "rw_mbytes_per_sec": 0, 00:17:00.910 "r_mbytes_per_sec": 0, 00:17:00.910 "w_mbytes_per_sec": 0 00:17:00.910 }, 00:17:00.910 "claimed": false, 00:17:00.910 "zoned": false, 00:17:00.910 "supported_io_types": { 00:17:00.910 "read": true, 00:17:00.910 "write": true, 00:17:00.910 "unmap": false, 00:17:00.910 "flush": false, 00:17:00.910 "reset": true, 00:17:00.910 "nvme_admin": false, 00:17:00.910 "nvme_io": false, 00:17:00.910 "nvme_io_md": false, 00:17:00.910 "write_zeroes": true, 00:17:00.910 "zcopy": false, 00:17:00.910 "get_zone_info": false, 00:17:00.910 "zone_management": false, 00:17:00.910 "zone_append": false, 00:17:00.910 "compare": false, 00:17:00.910 "compare_and_write": false, 00:17:00.910 "abort": false, 00:17:00.910 "seek_hole": false, 00:17:00.910 "seek_data": false, 00:17:00.910 "copy": false, 00:17:00.910 "nvme_iov_md": false 00:17:00.910 }, 00:17:00.910 "memory_domains": [ 00:17:00.910 { 
00:17:00.910 "dma_device_id": "system", 00:17:00.910 "dma_device_type": 1 00:17:00.910 }, 00:17:00.910 { 00:17:00.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.910 "dma_device_type": 2 00:17:00.910 }, 00:17:00.910 { 00:17:00.910 "dma_device_id": "system", 00:17:00.910 "dma_device_type": 1 00:17:00.910 }, 00:17:00.910 { 00:17:00.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.910 "dma_device_type": 2 00:17:00.910 } 00:17:00.910 ], 00:17:00.910 "driver_specific": { 00:17:00.910 "raid": { 00:17:00.910 "uuid": "176b62ca-6281-4bd4-91c7-418cb397aba4", 00:17:00.910 "strip_size_kb": 0, 00:17:00.910 "state": "online", 00:17:00.910 "raid_level": "raid1", 00:17:00.910 "superblock": true, 00:17:00.910 "num_base_bdevs": 2, 00:17:00.910 "num_base_bdevs_discovered": 2, 00:17:00.910 "num_base_bdevs_operational": 2, 00:17:00.910 "base_bdevs_list": [ 00:17:00.910 { 00:17:00.910 "name": "BaseBdev1", 00:17:00.910 "uuid": "f423b792-96c6-47c1-9ad1-f7569ce313a1", 00:17:00.910 "is_configured": true, 00:17:00.910 "data_offset": 2048, 00:17:00.910 "data_size": 63488 00:17:00.910 }, 00:17:00.910 { 00:17:00.910 "name": "BaseBdev2", 00:17:00.910 "uuid": "3ab760cf-528c-4314-a064-9a7c0099949d", 00:17:00.910 "is_configured": true, 00:17:00.910 "data_offset": 2048, 00:17:00.910 "data_size": 63488 00:17:00.910 } 00:17:00.910 ] 00:17:00.910 } 00:17:00.910 } 00:17:00.910 }' 00:17:00.910 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:00.910 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:00.910 BaseBdev2' 00:17:00.910 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.910 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:17:00.910 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.169 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.169 "name": "BaseBdev1", 00:17:01.169 "aliases": [ 00:17:01.169 "f423b792-96c6-47c1-9ad1-f7569ce313a1" 00:17:01.169 ], 00:17:01.169 "product_name": "Malloc disk", 00:17:01.169 "block_size": 512, 00:17:01.169 "num_blocks": 65536, 00:17:01.169 "uuid": "f423b792-96c6-47c1-9ad1-f7569ce313a1", 00:17:01.169 "assigned_rate_limits": { 00:17:01.169 "rw_ios_per_sec": 0, 00:17:01.169 "rw_mbytes_per_sec": 0, 00:17:01.169 "r_mbytes_per_sec": 0, 00:17:01.169 "w_mbytes_per_sec": 0 00:17:01.169 }, 00:17:01.169 "claimed": true, 00:17:01.169 "claim_type": "exclusive_write", 00:17:01.169 "zoned": false, 00:17:01.169 "supported_io_types": { 00:17:01.169 "read": true, 00:17:01.169 "write": true, 00:17:01.169 "unmap": true, 00:17:01.169 "flush": true, 00:17:01.169 "reset": true, 00:17:01.169 "nvme_admin": false, 00:17:01.169 "nvme_io": false, 00:17:01.169 "nvme_io_md": false, 00:17:01.169 "write_zeroes": true, 00:17:01.169 "zcopy": true, 00:17:01.169 "get_zone_info": false, 00:17:01.169 "zone_management": false, 00:17:01.169 "zone_append": false, 00:17:01.169 "compare": false, 00:17:01.169 "compare_and_write": false, 00:17:01.169 "abort": true, 00:17:01.169 "seek_hole": false, 00:17:01.169 "seek_data": false, 00:17:01.169 "copy": true, 00:17:01.169 "nvme_iov_md": false 00:17:01.169 }, 00:17:01.169 "memory_domains": [ 00:17:01.169 { 00:17:01.169 "dma_device_id": "system", 00:17:01.169 "dma_device_type": 1 00:17:01.169 }, 00:17:01.169 { 00:17:01.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.169 "dma_device_type": 2 00:17:01.169 } 00:17:01.169 ], 00:17:01.169 "driver_specific": {} 00:17:01.169 }' 00:17:01.169 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.169 17:10:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.427 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.685 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.685 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:01.685 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:01.685 17:10:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.685 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.685 "name": "BaseBdev2", 00:17:01.685 "aliases": [ 00:17:01.685 "3ab760cf-528c-4314-a064-9a7c0099949d" 00:17:01.685 ], 00:17:01.685 "product_name": "Malloc disk", 00:17:01.685 "block_size": 512, 00:17:01.685 "num_blocks": 65536, 00:17:01.685 "uuid": "3ab760cf-528c-4314-a064-9a7c0099949d", 00:17:01.685 
"assigned_rate_limits": { 00:17:01.685 "rw_ios_per_sec": 0, 00:17:01.685 "rw_mbytes_per_sec": 0, 00:17:01.685 "r_mbytes_per_sec": 0, 00:17:01.685 "w_mbytes_per_sec": 0 00:17:01.685 }, 00:17:01.685 "claimed": true, 00:17:01.685 "claim_type": "exclusive_write", 00:17:01.685 "zoned": false, 00:17:01.685 "supported_io_types": { 00:17:01.685 "read": true, 00:17:01.685 "write": true, 00:17:01.685 "unmap": true, 00:17:01.685 "flush": true, 00:17:01.685 "reset": true, 00:17:01.685 "nvme_admin": false, 00:17:01.685 "nvme_io": false, 00:17:01.685 "nvme_io_md": false, 00:17:01.685 "write_zeroes": true, 00:17:01.685 "zcopy": true, 00:17:01.685 "get_zone_info": false, 00:17:01.685 "zone_management": false, 00:17:01.685 "zone_append": false, 00:17:01.685 "compare": false, 00:17:01.685 "compare_and_write": false, 00:17:01.685 "abort": true, 00:17:01.685 "seek_hole": false, 00:17:01.685 "seek_data": false, 00:17:01.685 "copy": true, 00:17:01.685 "nvme_iov_md": false 00:17:01.685 }, 00:17:01.685 "memory_domains": [ 00:17:01.685 { 00:17:01.685 "dma_device_id": "system", 00:17:01.685 "dma_device_type": 1 00:17:01.685 }, 00:17:01.685 { 00:17:01.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.685 "dma_device_type": 2 00:17:01.685 } 00:17:01.685 ], 00:17:01.685 "driver_specific": {} 00:17:01.685 }' 00:17:01.685 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.942 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.942 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.942 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.942 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.943 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:01.943 17:10:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.943 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.200 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.200 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.200 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.200 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.200 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:02.459 [2024-07-23 17:10:57.695786] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.459 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.717 17:10:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.717 "name": "Existed_Raid", 00:17:02.717 "uuid": "176b62ca-6281-4bd4-91c7-418cb397aba4", 00:17:02.717 "strip_size_kb": 0, 00:17:02.717 "state": "online", 00:17:02.717 "raid_level": "raid1", 00:17:02.717 "superblock": true, 00:17:02.717 "num_base_bdevs": 2, 00:17:02.717 "num_base_bdevs_discovered": 1, 00:17:02.717 "num_base_bdevs_operational": 1, 00:17:02.717 "base_bdevs_list": [ 00:17:02.717 { 00:17:02.717 "name": null, 00:17:02.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.717 "is_configured": false, 00:17:02.717 "data_offset": 2048, 00:17:02.717 "data_size": 63488 00:17:02.717 }, 00:17:02.717 { 00:17:02.717 "name": "BaseBdev2", 00:17:02.717 "uuid": "3ab760cf-528c-4314-a064-9a7c0099949d", 00:17:02.717 "is_configured": true, 00:17:02.717 "data_offset": 2048, 00:17:02.717 "data_size": 63488 00:17:02.717 } 00:17:02.717 ] 00:17:02.717 }' 00:17:02.717 17:10:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.717 17:10:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:03.282 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:03.282 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:03.282 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.282 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:03.540 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:03.540 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:03.540 17:10:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:03.798 [2024-07-23 17:10:58.984388] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:03.798 [2024-07-23 17:10:58.984471] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:03.799 [2024-07-23 17:10:58.995338] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:03.799 [2024-07-23 17:10:58.995375] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:03.799 [2024-07-23 17:10:58.995387] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1763970 name Existed_Raid, state offline 00:17:03.799 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:03.799 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:17:03.799 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.799 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4123034 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4123034 ']' 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4123034 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4123034 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4123034' 00:17:04.057 killing process with pid 4123034 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4123034 00:17:04.057 [2024-07-23 17:10:59.327724] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:17:04.057 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4123034 00:17:04.057 [2024-07-23 17:10:59.328617] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:04.316 17:10:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:04.316 00:17:04.316 real 0m10.497s 00:17:04.316 user 0m18.639s 00:17:04.316 sys 0m1.999s 00:17:04.316 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:04.316 17:10:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:04.316 ************************************ 00:17:04.316 END TEST raid_state_function_test_sb 00:17:04.316 ************************************ 00:17:04.316 17:10:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:04.316 17:10:59 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:17:04.316 17:10:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:04.316 17:10:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:04.317 17:10:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:04.317 ************************************ 00:17:04.317 START TEST raid_superblock_test 00:17:04.317 ************************************ 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4124541 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4124541 /var/tmp/spdk-raid.sock 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4124541 ']' 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:04.317 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:04.317 17:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.317 [2024-07-23 17:10:59.690081] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:17:04.317 [2024-07-23 17:10:59.690158] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4124541 ] 00:17:04.576 [2024-07-23 17:10:59.826020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:04.576 [2024-07-23 17:10:59.880549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:04.576 [2024-07-23 17:10:59.942722] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:04.576 [2024-07-23 17:10:59.942761] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:05.511 
17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:05.511 malloc1 00:17:05.511 17:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:05.776 [2024-07-23 17:11:01.112540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:05.776 [2024-07-23 17:11:01.112589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.776 [2024-07-23 17:11:01.112609] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2646070 00:17:05.776 [2024-07-23 17:11:01.112621] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.776 [2024-07-23 17:11:01.114101] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.776 [2024-07-23 17:11:01.114127] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:05.776 pt1 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:05.776 17:11:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:05.776 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:06.034 malloc2 00:17:06.034 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:06.292 [2024-07-23 17:11:01.614577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:06.292 [2024-07-23 17:11:01.614627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.292 [2024-07-23 17:11:01.614645] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252c920 00:17:06.292 [2024-07-23 17:11:01.614657] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.292 [2024-07-23 17:11:01.616189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.292 [2024-07-23 17:11:01.616215] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:06.292 pt2 00:17:06.292 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:06.292 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:06.293 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:17:06.551 [2024-07-23 17:11:01.855218] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:06.551 [2024-07-23 17:11:01.856347] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:06.551 [2024-07-23 17:11:01.856476] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x263e3e0 00:17:06.551 [2024-07-23 17:11:01.856489] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:06.551 [2024-07-23 17:11:01.856658] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x263f280 00:17:06.551 [2024-07-23 17:11:01.856794] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x263e3e0 00:17:06.551 [2024-07-23 17:11:01.856804] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x263e3e0 00:17:06.551 [2024-07-23 17:11:01.856902] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.551 17:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.809 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.809 "name": "raid_bdev1", 00:17:06.809 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:06.809 "strip_size_kb": 0, 00:17:06.809 "state": "online", 00:17:06.809 "raid_level": "raid1", 00:17:06.809 "superblock": true, 00:17:06.809 "num_base_bdevs": 2, 00:17:06.809 "num_base_bdevs_discovered": 2, 00:17:06.809 "num_base_bdevs_operational": 2, 00:17:06.809 "base_bdevs_list": [ 00:17:06.809 { 00:17:06.809 "name": "pt1", 00:17:06.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:06.809 "is_configured": true, 00:17:06.809 "data_offset": 2048, 00:17:06.809 "data_size": 63488 00:17:06.809 }, 00:17:06.809 { 00:17:06.809 "name": "pt2", 00:17:06.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.809 "is_configured": true, 00:17:06.809 "data_offset": 2048, 00:17:06.810 "data_size": 63488 00:17:06.810 } 00:17:06.810 ] 00:17:06.810 }' 00:17:06.810 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.810 17:11:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:07.376 17:11:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:07.376 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:07.635 [2024-07-23 17:11:02.962361] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:07.635 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:07.635 "name": "raid_bdev1", 00:17:07.635 "aliases": [ 00:17:07.635 "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0" 00:17:07.635 ], 00:17:07.635 "product_name": "Raid Volume", 00:17:07.635 "block_size": 512, 00:17:07.635 "num_blocks": 63488, 00:17:07.635 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:07.635 "assigned_rate_limits": { 00:17:07.635 "rw_ios_per_sec": 0, 00:17:07.635 "rw_mbytes_per_sec": 0, 00:17:07.635 "r_mbytes_per_sec": 0, 00:17:07.635 "w_mbytes_per_sec": 0 00:17:07.635 }, 00:17:07.635 "claimed": false, 00:17:07.635 "zoned": false, 00:17:07.635 "supported_io_types": { 00:17:07.635 "read": true, 00:17:07.635 "write": true, 00:17:07.635 "unmap": false, 00:17:07.635 "flush": false, 00:17:07.635 "reset": true, 00:17:07.635 "nvme_admin": false, 00:17:07.635 "nvme_io": false, 00:17:07.635 "nvme_io_md": false, 00:17:07.635 "write_zeroes": true, 00:17:07.635 "zcopy": false, 00:17:07.635 "get_zone_info": false, 00:17:07.635 "zone_management": false, 00:17:07.635 "zone_append": false, 00:17:07.635 "compare": false, 00:17:07.635 "compare_and_write": false, 00:17:07.635 
"abort": false, 00:17:07.635 "seek_hole": false, 00:17:07.635 "seek_data": false, 00:17:07.635 "copy": false, 00:17:07.635 "nvme_iov_md": false 00:17:07.635 }, 00:17:07.635 "memory_domains": [ 00:17:07.635 { 00:17:07.635 "dma_device_id": "system", 00:17:07.635 "dma_device_type": 1 00:17:07.635 }, 00:17:07.635 { 00:17:07.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.635 "dma_device_type": 2 00:17:07.635 }, 00:17:07.635 { 00:17:07.635 "dma_device_id": "system", 00:17:07.635 "dma_device_type": 1 00:17:07.635 }, 00:17:07.635 { 00:17:07.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.635 "dma_device_type": 2 00:17:07.635 } 00:17:07.635 ], 00:17:07.635 "driver_specific": { 00:17:07.635 "raid": { 00:17:07.635 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:07.635 "strip_size_kb": 0, 00:17:07.635 "state": "online", 00:17:07.635 "raid_level": "raid1", 00:17:07.635 "superblock": true, 00:17:07.635 "num_base_bdevs": 2, 00:17:07.635 "num_base_bdevs_discovered": 2, 00:17:07.635 "num_base_bdevs_operational": 2, 00:17:07.635 "base_bdevs_list": [ 00:17:07.635 { 00:17:07.635 "name": "pt1", 00:17:07.635 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.635 "is_configured": true, 00:17:07.635 "data_offset": 2048, 00:17:07.635 "data_size": 63488 00:17:07.635 }, 00:17:07.635 { 00:17:07.635 "name": "pt2", 00:17:07.635 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.635 "is_configured": true, 00:17:07.635 "data_offset": 2048, 00:17:07.635 "data_size": 63488 00:17:07.635 } 00:17:07.635 ] 00:17:07.635 } 00:17:07.635 } 00:17:07.635 }' 00:17:07.635 17:11:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:07.635 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:07.635 pt2' 00:17:07.635 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.635 17:11:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:07.635 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.894 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.894 "name": "pt1", 00:17:07.894 "aliases": [ 00:17:07.894 "00000000-0000-0000-0000-000000000001" 00:17:07.894 ], 00:17:07.894 "product_name": "passthru", 00:17:07.894 "block_size": 512, 00:17:07.894 "num_blocks": 65536, 00:17:07.894 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.894 "assigned_rate_limits": { 00:17:07.894 "rw_ios_per_sec": 0, 00:17:07.894 "rw_mbytes_per_sec": 0, 00:17:07.894 "r_mbytes_per_sec": 0, 00:17:07.894 "w_mbytes_per_sec": 0 00:17:07.894 }, 00:17:07.894 "claimed": true, 00:17:07.894 "claim_type": "exclusive_write", 00:17:07.894 "zoned": false, 00:17:07.894 "supported_io_types": { 00:17:07.894 "read": true, 00:17:07.894 "write": true, 00:17:07.894 "unmap": true, 00:17:07.894 "flush": true, 00:17:07.894 "reset": true, 00:17:07.894 "nvme_admin": false, 00:17:07.894 "nvme_io": false, 00:17:07.894 "nvme_io_md": false, 00:17:07.894 "write_zeroes": true, 00:17:07.894 "zcopy": true, 00:17:07.894 "get_zone_info": false, 00:17:07.894 "zone_management": false, 00:17:07.894 "zone_append": false, 00:17:07.894 "compare": false, 00:17:07.894 "compare_and_write": false, 00:17:07.894 "abort": true, 00:17:07.894 "seek_hole": false, 00:17:07.894 "seek_data": false, 00:17:07.894 "copy": true, 00:17:07.894 "nvme_iov_md": false 00:17:07.894 }, 00:17:07.894 "memory_domains": [ 00:17:07.894 { 00:17:07.894 "dma_device_id": "system", 00:17:07.894 "dma_device_type": 1 00:17:07.894 }, 00:17:07.894 { 00:17:07.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.894 "dma_device_type": 2 00:17:07.894 } 00:17:07.894 ], 00:17:07.894 "driver_specific": { 00:17:07.894 "passthru": { 00:17:07.894 
"name": "pt1", 00:17:07.894 "base_bdev_name": "malloc1" 00:17:07.894 } 00:17:07.894 } 00:17:07.894 }' 00:17:07.894 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.154 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.154 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.154 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.154 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.154 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.154 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:08.413 17:11:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.673 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.673 "name": "pt2", 00:17:08.673 "aliases": [ 00:17:08.673 "00000000-0000-0000-0000-000000000002" 00:17:08.673 ], 00:17:08.673 "product_name": "passthru", 00:17:08.673 "block_size": 512, 00:17:08.673 
"num_blocks": 65536, 00:17:08.673 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:08.673 "assigned_rate_limits": { 00:17:08.673 "rw_ios_per_sec": 0, 00:17:08.673 "rw_mbytes_per_sec": 0, 00:17:08.673 "r_mbytes_per_sec": 0, 00:17:08.673 "w_mbytes_per_sec": 0 00:17:08.673 }, 00:17:08.673 "claimed": true, 00:17:08.673 "claim_type": "exclusive_write", 00:17:08.673 "zoned": false, 00:17:08.673 "supported_io_types": { 00:17:08.673 "read": true, 00:17:08.673 "write": true, 00:17:08.673 "unmap": true, 00:17:08.673 "flush": true, 00:17:08.673 "reset": true, 00:17:08.673 "nvme_admin": false, 00:17:08.673 "nvme_io": false, 00:17:08.673 "nvme_io_md": false, 00:17:08.673 "write_zeroes": true, 00:17:08.673 "zcopy": true, 00:17:08.673 "get_zone_info": false, 00:17:08.673 "zone_management": false, 00:17:08.673 "zone_append": false, 00:17:08.673 "compare": false, 00:17:08.673 "compare_and_write": false, 00:17:08.673 "abort": true, 00:17:08.673 "seek_hole": false, 00:17:08.673 "seek_data": false, 00:17:08.673 "copy": true, 00:17:08.673 "nvme_iov_md": false 00:17:08.673 }, 00:17:08.673 "memory_domains": [ 00:17:08.673 { 00:17:08.673 "dma_device_id": "system", 00:17:08.673 "dma_device_type": 1 00:17:08.673 }, 00:17:08.673 { 00:17:08.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.673 "dma_device_type": 2 00:17:08.673 } 00:17:08.673 ], 00:17:08.673 "driver_specific": { 00:17:08.673 "passthru": { 00:17:08.673 "name": "pt2", 00:17:08.673 "base_bdev_name": "malloc2" 00:17:08.674 } 00:17:08.674 } 00:17:08.674 }' 00:17:08.674 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.674 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.974 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:09.233 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:09.233 [2024-07-23 17:11:04.618744] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:09.233 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0 00:17:09.233 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0 ']' 00:17:09.233 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:09.492 [2024-07-23 17:11:04.798994] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:09.492 [2024-07-23 17:11:04.799023] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:09.492 [2024-07-23 17:11:04.799082] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:09.492 [2024-07-23 
17:11:04.799134] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:09.492 [2024-07-23 17:11:04.799145] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x263e3e0 name raid_bdev1, state offline 00:17:09.492 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.492 17:11:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:09.751 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:09.751 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:09.751 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:09.751 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:10.010 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:10.010 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:10.269 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:10.269 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:10.528 17:11:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:17:10.788 [2024-07-23 17:11:06.046246] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:10.788 [2024-07-23 17:11:06.047624] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:10.788 [2024-07-23 17:11:06.047698] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:10.788 [2024-07-23 17:11:06.047739] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:10.788 [2024-07-23 17:11:06.047759] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:10.788 [2024-07-23 17:11:06.047768] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252c090 name raid_bdev1, state configuring 00:17:10.788 request: 00:17:10.788 { 00:17:10.788 "name": "raid_bdev1", 00:17:10.788 "raid_level": "raid1", 00:17:10.788 "base_bdevs": [ 00:17:10.788 "malloc1", 00:17:10.788 "malloc2" 00:17:10.788 ], 00:17:10.788 "superblock": false, 00:17:10.788 "method": "bdev_raid_create", 00:17:10.788 "req_id": 1 00:17:10.788 } 00:17:10.788 Got JSON-RPC error response 00:17:10.788 response: 00:17:10.788 { 00:17:10.788 "code": -17, 00:17:10.788 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:10.788 } 00:17:10.788 17:11:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:10.788 17:11:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:10.788 17:11:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:10.788 17:11:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:10.788 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.788 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:11.047 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:17:11.047 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:11.047 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:11.307 [2024-07-23 17:11:06.543507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:11.307 [2024-07-23 17:11:06.543547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.307 [2024-07-23 17:11:06.543564] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26471a0 00:17:11.307 [2024-07-23 17:11:06.543576] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.307 [2024-07-23 17:11:06.545141] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.307 [2024-07-23 17:11:06.545171] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:11.307 [2024-07-23 17:11:06.545234] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:11.307 [2024-07-23 17:11:06.545260] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:11.307 pt1 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.307 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:11.566 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.566 "name": "raid_bdev1", 00:17:11.566 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:11.566 "strip_size_kb": 0, 00:17:11.566 "state": "configuring", 00:17:11.566 "raid_level": "raid1", 00:17:11.566 "superblock": true, 00:17:11.566 "num_base_bdevs": 2, 00:17:11.566 "num_base_bdevs_discovered": 1, 00:17:11.566 "num_base_bdevs_operational": 2, 00:17:11.566 "base_bdevs_list": [ 00:17:11.566 { 00:17:11.566 "name": "pt1", 00:17:11.566 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:11.566 "is_configured": true, 00:17:11.566 "data_offset": 2048, 00:17:11.566 "data_size": 63488 00:17:11.566 }, 00:17:11.566 { 00:17:11.566 "name": null, 00:17:11.566 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:11.566 "is_configured": false, 00:17:11.566 "data_offset": 2048, 00:17:11.566 "data_size": 63488 00:17:11.566 } 00:17:11.566 ] 00:17:11.566 }' 00:17:11.566 17:11:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.566 17:11:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.134 17:11:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:17:12.134 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:12.134 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:12.134 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:12.393 [2024-07-23 17:11:07.570236] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:12.393 [2024-07-23 17:11:07.570285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.393 [2024-07-23 17:11:07.570304] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24958a0 00:17:12.393 [2024-07-23 17:11:07.570316] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.393 [2024-07-23 17:11:07.570657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.393 [2024-07-23 17:11:07.570675] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:12.393 [2024-07-23 17:11:07.570736] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:12.393 [2024-07-23 17:11:07.570754] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:12.393 [2024-07-23 17:11:07.570852] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2495350 00:17:12.393 [2024-07-23 17:11:07.570863] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:12.393 [2024-07-23 17:11:07.571035] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2492f80 00:17:12.393 [2024-07-23 17:11:07.571164] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2495350 00:17:12.393 [2024-07-23 17:11:07.571174] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2495350 00:17:12.393 [2024-07-23 17:11:07.571270] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:12.393 pt2 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:12.393 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.394 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.394 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.394 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.394 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.394 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:12.653 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.653 "name": 
"raid_bdev1", 00:17:12.653 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:12.653 "strip_size_kb": 0, 00:17:12.653 "state": "online", 00:17:12.653 "raid_level": "raid1", 00:17:12.653 "superblock": true, 00:17:12.653 "num_base_bdevs": 2, 00:17:12.653 "num_base_bdevs_discovered": 2, 00:17:12.653 "num_base_bdevs_operational": 2, 00:17:12.653 "base_bdevs_list": [ 00:17:12.653 { 00:17:12.653 "name": "pt1", 00:17:12.653 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:12.653 "is_configured": true, 00:17:12.653 "data_offset": 2048, 00:17:12.653 "data_size": 63488 00:17:12.653 }, 00:17:12.653 { 00:17:12.653 "name": "pt2", 00:17:12.653 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:12.653 "is_configured": true, 00:17:12.653 "data_offset": 2048, 00:17:12.653 "data_size": 63488 00:17:12.653 } 00:17:12.653 ] 00:17:12.653 }' 00:17:12.653 17:11:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.653 17:11:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:13.222 [2024-07-23 
17:11:08.561116] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:13.222 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:13.222 "name": "raid_bdev1", 00:17:13.222 "aliases": [ 00:17:13.222 "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0" 00:17:13.222 ], 00:17:13.222 "product_name": "Raid Volume", 00:17:13.222 "block_size": 512, 00:17:13.222 "num_blocks": 63488, 00:17:13.222 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:13.222 "assigned_rate_limits": { 00:17:13.222 "rw_ios_per_sec": 0, 00:17:13.222 "rw_mbytes_per_sec": 0, 00:17:13.222 "r_mbytes_per_sec": 0, 00:17:13.222 "w_mbytes_per_sec": 0 00:17:13.222 }, 00:17:13.222 "claimed": false, 00:17:13.222 "zoned": false, 00:17:13.222 "supported_io_types": { 00:17:13.222 "read": true, 00:17:13.223 "write": true, 00:17:13.223 "unmap": false, 00:17:13.223 "flush": false, 00:17:13.223 "reset": true, 00:17:13.223 "nvme_admin": false, 00:17:13.223 "nvme_io": false, 00:17:13.223 "nvme_io_md": false, 00:17:13.223 "write_zeroes": true, 00:17:13.223 "zcopy": false, 00:17:13.223 "get_zone_info": false, 00:17:13.223 "zone_management": false, 00:17:13.223 "zone_append": false, 00:17:13.223 "compare": false, 00:17:13.223 "compare_and_write": false, 00:17:13.223 "abort": false, 00:17:13.223 "seek_hole": false, 00:17:13.223 "seek_data": false, 00:17:13.223 "copy": false, 00:17:13.223 "nvme_iov_md": false 00:17:13.223 }, 00:17:13.223 "memory_domains": [ 00:17:13.223 { 00:17:13.223 "dma_device_id": "system", 00:17:13.223 "dma_device_type": 1 00:17:13.223 }, 00:17:13.223 { 00:17:13.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.223 "dma_device_type": 2 00:17:13.223 }, 00:17:13.223 { 00:17:13.223 "dma_device_id": "system", 00:17:13.223 "dma_device_type": 1 00:17:13.223 }, 00:17:13.223 { 00:17:13.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.223 "dma_device_type": 2 00:17:13.223 } 00:17:13.223 ], 00:17:13.223 "driver_specific": { 00:17:13.223 
"raid": { 00:17:13.223 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:13.223 "strip_size_kb": 0, 00:17:13.223 "state": "online", 00:17:13.223 "raid_level": "raid1", 00:17:13.223 "superblock": true, 00:17:13.223 "num_base_bdevs": 2, 00:17:13.223 "num_base_bdevs_discovered": 2, 00:17:13.223 "num_base_bdevs_operational": 2, 00:17:13.223 "base_bdevs_list": [ 00:17:13.223 { 00:17:13.223 "name": "pt1", 00:17:13.223 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:13.223 "is_configured": true, 00:17:13.223 "data_offset": 2048, 00:17:13.223 "data_size": 63488 00:17:13.223 }, 00:17:13.223 { 00:17:13.223 "name": "pt2", 00:17:13.223 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:13.223 "is_configured": true, 00:17:13.223 "data_offset": 2048, 00:17:13.223 "data_size": 63488 00:17:13.223 } 00:17:13.223 ] 00:17:13.223 } 00:17:13.223 } 00:17:13.223 }' 00:17:13.223 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:13.223 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:13.223 pt2' 00:17:13.223 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.223 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:13.223 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:13.482 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:13.482 "name": "pt1", 00:17:13.482 "aliases": [ 00:17:13.482 "00000000-0000-0000-0000-000000000001" 00:17:13.482 ], 00:17:13.482 "product_name": "passthru", 00:17:13.482 "block_size": 512, 00:17:13.482 "num_blocks": 65536, 00:17:13.482 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:13.482 "assigned_rate_limits": { 
00:17:13.482 "rw_ios_per_sec": 0, 00:17:13.482 "rw_mbytes_per_sec": 0, 00:17:13.482 "r_mbytes_per_sec": 0, 00:17:13.482 "w_mbytes_per_sec": 0 00:17:13.482 }, 00:17:13.482 "claimed": true, 00:17:13.482 "claim_type": "exclusive_write", 00:17:13.482 "zoned": false, 00:17:13.482 "supported_io_types": { 00:17:13.482 "read": true, 00:17:13.482 "write": true, 00:17:13.482 "unmap": true, 00:17:13.482 "flush": true, 00:17:13.482 "reset": true, 00:17:13.482 "nvme_admin": false, 00:17:13.482 "nvme_io": false, 00:17:13.482 "nvme_io_md": false, 00:17:13.482 "write_zeroes": true, 00:17:13.482 "zcopy": true, 00:17:13.482 "get_zone_info": false, 00:17:13.482 "zone_management": false, 00:17:13.482 "zone_append": false, 00:17:13.482 "compare": false, 00:17:13.482 "compare_and_write": false, 00:17:13.482 "abort": true, 00:17:13.482 "seek_hole": false, 00:17:13.482 "seek_data": false, 00:17:13.482 "copy": true, 00:17:13.482 "nvme_iov_md": false 00:17:13.482 }, 00:17:13.482 "memory_domains": [ 00:17:13.482 { 00:17:13.482 "dma_device_id": "system", 00:17:13.482 "dma_device_type": 1 00:17:13.482 }, 00:17:13.482 { 00:17:13.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.482 "dma_device_type": 2 00:17:13.482 } 00:17:13.482 ], 00:17:13.482 "driver_specific": { 00:17:13.482 "passthru": { 00:17:13.482 "name": "pt1", 00:17:13.482 "base_bdev_name": "malloc1" 00:17:13.482 } 00:17:13.482 } 00:17:13.482 }' 00:17:13.482 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.482 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.482 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:13.482 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.741 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.741 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:13.741 17:11:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:13.741 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:14.000 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:14.000 "name": "pt2", 00:17:14.000 "aliases": [ 00:17:14.000 "00000000-0000-0000-0000-000000000002" 00:17:14.000 ], 00:17:14.000 "product_name": "passthru", 00:17:14.000 "block_size": 512, 00:17:14.000 "num_blocks": 65536, 00:17:14.000 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:14.000 "assigned_rate_limits": { 00:17:14.000 "rw_ios_per_sec": 0, 00:17:14.000 "rw_mbytes_per_sec": 0, 00:17:14.000 "r_mbytes_per_sec": 0, 00:17:14.000 "w_mbytes_per_sec": 0 00:17:14.000 }, 00:17:14.000 "claimed": true, 00:17:14.000 "claim_type": "exclusive_write", 00:17:14.000 "zoned": false, 00:17:14.000 "supported_io_types": { 00:17:14.000 "read": true, 00:17:14.000 "write": true, 00:17:14.000 "unmap": true, 00:17:14.000 "flush": true, 00:17:14.000 "reset": true, 00:17:14.000 "nvme_admin": false, 00:17:14.000 "nvme_io": false, 00:17:14.000 "nvme_io_md": false, 00:17:14.000 "write_zeroes": true, 
00:17:14.000 "zcopy": true, 00:17:14.000 "get_zone_info": false, 00:17:14.000 "zone_management": false, 00:17:14.000 "zone_append": false, 00:17:14.000 "compare": false, 00:17:14.000 "compare_and_write": false, 00:17:14.000 "abort": true, 00:17:14.000 "seek_hole": false, 00:17:14.000 "seek_data": false, 00:17:14.000 "copy": true, 00:17:14.000 "nvme_iov_md": false 00:17:14.000 }, 00:17:14.001 "memory_domains": [ 00:17:14.001 { 00:17:14.001 "dma_device_id": "system", 00:17:14.001 "dma_device_type": 1 00:17:14.001 }, 00:17:14.001 { 00:17:14.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.001 "dma_device_type": 2 00:17:14.001 } 00:17:14.001 ], 00:17:14.001 "driver_specific": { 00:17:14.001 "passthru": { 00:17:14.001 "name": "pt2", 00:17:14.001 "base_bdev_name": "malloc2" 00:17:14.001 } 00:17:14.001 } 00:17:14.001 }' 00:17:14.001 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.001 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:14.260 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:14.520 [2024-07-23 17:11:09.812409] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:14.520 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0 '!=' d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0 ']' 00:17:14.520 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:14.520 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:14.520 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:14.520 17:11:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:14.779 [2024-07-23 17:11:10.068876] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.779 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.780 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.780 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.348 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.348 "name": "raid_bdev1", 00:17:15.348 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:15.348 "strip_size_kb": 0, 00:17:15.348 "state": "online", 00:17:15.348 "raid_level": "raid1", 00:17:15.348 "superblock": true, 00:17:15.348 "num_base_bdevs": 2, 00:17:15.348 "num_base_bdevs_discovered": 1, 00:17:15.348 "num_base_bdevs_operational": 1, 00:17:15.348 "base_bdevs_list": [ 00:17:15.348 { 00:17:15.348 "name": null, 00:17:15.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.348 "is_configured": false, 00:17:15.348 "data_offset": 2048, 00:17:15.348 "data_size": 63488 00:17:15.348 }, 00:17:15.348 { 00:17:15.348 "name": "pt2", 00:17:15.348 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:15.348 "is_configured": true, 00:17:15.348 "data_offset": 2048, 00:17:15.348 "data_size": 63488 00:17:15.348 } 00:17:15.348 ] 00:17:15.348 }' 00:17:15.348 17:11:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.348 17:11:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.917 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:17:16.177 [2024-07-23 17:11:11.360292] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:16.177 [2024-07-23 17:11:11.360322] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:16.177 [2024-07-23 17:11:11.360377] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:16.177 [2024-07-23 17:11:11.360421] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:16.177 [2024-07-23 17:11:11.360432] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2495350 name raid_bdev1, state offline 00:17:16.177 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.177 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:16.436 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:16.436 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:16.436 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:16.436 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:16.436 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:16.696 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:16.696 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:16.696 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:16.696 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:16.696 17:11:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:17:16.696 17:11:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:16.696 [2024-07-23 17:11:12.102226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:16.696 [2024-07-23 17:11:12.102275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:16.696 [2024-07-23 17:11:12.102295] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2647850 00:17:16.696 [2024-07-23 17:11:12.102307] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:16.696 [2024-07-23 17:11:12.103903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:16.696 [2024-07-23 17:11:12.103931] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:16.696 [2024-07-23 17:11:12.103999] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:16.696 [2024-07-23 17:11:12.104022] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:16.696 [2024-07-23 17:11:12.104108] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2640990 00:17:16.696 [2024-07-23 17:11:12.104118] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:16.696 [2024-07-23 17:11:12.104287] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2646300 00:17:16.696 [2024-07-23 17:11:12.104406] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2640990 00:17:16.696 [2024-07-23 17:11:12.104416] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2640990 00:17:16.696 [2024-07-23 17:11:12.104513] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.696 pt2 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.956 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:17.216 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.216 "name": "raid_bdev1", 00:17:17.216 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:17.216 "strip_size_kb": 0, 00:17:17.216 "state": "online", 00:17:17.216 "raid_level": "raid1", 00:17:17.216 "superblock": true, 00:17:17.216 "num_base_bdevs": 2, 00:17:17.216 "num_base_bdevs_discovered": 1, 00:17:17.216 "num_base_bdevs_operational": 1, 00:17:17.216 "base_bdevs_list": [ 
00:17:17.216 { 00:17:17.216 "name": null, 00:17:17.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.216 "is_configured": false, 00:17:17.216 "data_offset": 2048, 00:17:17.216 "data_size": 63488 00:17:17.216 }, 00:17:17.216 { 00:17:17.216 "name": "pt2", 00:17:17.216 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:17.216 "is_configured": true, 00:17:17.216 "data_offset": 2048, 00:17:17.216 "data_size": 63488 00:17:17.216 } 00:17:17.216 ] 00:17:17.216 }' 00:17:17.216 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.216 17:11:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.785 17:11:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:17.785 [2024-07-23 17:11:13.132950] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:17.785 [2024-07-23 17:11:13.132979] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:17.785 [2024-07-23 17:11:13.133026] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:17.785 [2024-07-23 17:11:13.133067] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:17.785 [2024-07-23 17:11:13.133078] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2640990 name raid_bdev1, state offline 00:17:17.785 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.785 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:18.044 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:18.044 17:11:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:18.044 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:17:18.044 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:18.302 [2024-07-23 17:11:13.642293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:18.302 [2024-07-23 17:11:13.642334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.302 [2024-07-23 17:11:13.642352] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2494a80 00:17:18.302 [2024-07-23 17:11:13.642364] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.302 [2024-07-23 17:11:13.643934] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.302 [2024-07-23 17:11:13.643961] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:18.302 [2024-07-23 17:11:13.644025] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:18.302 [2024-07-23 17:11:13.644048] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:18.302 [2024-07-23 17:11:13.644145] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:18.302 [2024-07-23 17:11:13.644158] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:18.302 [2024-07-23 17:11:13.644170] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2644690 name raid_bdev1, state configuring 00:17:18.302 [2024-07-23 17:11:13.644191] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:18.302 [2024-07-23 17:11:13.644247] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x263ff70 00:17:18.302 [2024-07-23 17:11:13.644257] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:18.302 [2024-07-23 17:11:13.644420] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26458d0 00:17:18.302 [2024-07-23 17:11:13.644539] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x263ff70 00:17:18.302 [2024-07-23 17:11:13.644548] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x263ff70 00:17:18.302 [2024-07-23 17:11:13.644640] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:18.302 pt1 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.302 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:18.303 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.303 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.303 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.303 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.303 17:11:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.303 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:18.562 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.562 "name": "raid_bdev1", 00:17:18.562 "uuid": "d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0", 00:17:18.562 "strip_size_kb": 0, 00:17:18.562 "state": "online", 00:17:18.562 "raid_level": "raid1", 00:17:18.562 "superblock": true, 00:17:18.562 "num_base_bdevs": 2, 00:17:18.562 "num_base_bdevs_discovered": 1, 00:17:18.562 "num_base_bdevs_operational": 1, 00:17:18.562 "base_bdevs_list": [ 00:17:18.562 { 00:17:18.562 "name": null, 00:17:18.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.562 "is_configured": false, 00:17:18.562 "data_offset": 2048, 00:17:18.562 "data_size": 63488 00:17:18.562 }, 00:17:18.562 { 00:17:18.562 "name": "pt2", 00:17:18.562 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:18.562 "is_configured": true, 00:17:18.562 "data_offset": 2048, 00:17:18.562 "data_size": 63488 00:17:18.562 } 00:17:18.562 ] 00:17:18.562 }' 00:17:18.562 17:11:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.562 17:11:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.130 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:19.130 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:19.390 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:19.390 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:19.390 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:19.649 [2024-07-23 17:11:14.982078] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:19.649 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0 '!=' d5dba9f2-ab8b-40e5-bfc3-5c57abbee4d0 ']' 00:17:19.649 17:11:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4124541 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4124541 ']' 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4124541 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4124541 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4124541' 00:17:19.649 killing process with pid 4124541 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4124541 00:17:19.649 [2024-07-23 17:11:15.070397] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:19.649 [2024-07-23 17:11:15.070451] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:19.649 [2024-07-23 17:11:15.070492] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:17:19.649 [2024-07-23 17:11:15.070503] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x263ff70 name raid_bdev1, state offline 00:17:19.649 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4124541 00:17:19.909 [2024-07-23 17:11:15.086966] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:19.909 17:11:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:19.909 00:17:19.909 real 0m15.651s 00:17:19.909 user 0m28.489s 00:17:19.909 sys 0m2.865s 00:17:19.909 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:19.909 17:11:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.909 ************************************ 00:17:19.909 END TEST raid_superblock_test 00:17:19.909 ************************************ 00:17:19.909 17:11:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:19.909 17:11:15 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:17:19.909 17:11:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:19.909 17:11:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:19.909 17:11:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:20.169 ************************************ 00:17:20.169 START TEST raid_read_error_test 00:17:20.169 ************************************ 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:20.169 
17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.xxnPDpJjEU 00:17:20.169 17:11:15 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4126969 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4126969 /var/tmp/spdk-raid.sock 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4126969 ']' 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:20.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:20.169 17:11:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.169 [2024-07-23 17:11:15.486260] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:17:20.169 [2024-07-23 17:11:15.486397] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4126969 ] 00:17:20.428 [2024-07-23 17:11:15.688122] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:20.428 [2024-07-23 17:11:15.741235] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:20.428 [2024-07-23 17:11:15.805128] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:20.428 [2024-07-23 17:11:15.805156] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:21.367 17:11:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:21.367 17:11:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:21.367 17:11:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:21.367 17:11:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:21.626 BaseBdev1_malloc 00:17:21.626 17:11:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:22.196 true 00:17:22.196 17:11:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:22.456 [2024-07-23 17:11:17.730373] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:22.456 [2024-07-23 17:11:17.730421] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:17:22.456 [2024-07-23 17:11:17.730440] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26da5c0 00:17:22.456 [2024-07-23 17:11:17.730453] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:22.456 [2024-07-23 17:11:17.732127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:22.456 [2024-07-23 17:11:17.732155] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:22.456 BaseBdev1 00:17:22.456 17:11:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:22.456 17:11:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:23.025 BaseBdev2_malloc 00:17:23.025 17:11:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:23.628 true 00:17:23.628 17:11:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:23.628 [2024-07-23 17:11:19.046619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:23.628 [2024-07-23 17:11:19.046664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:23.628 [2024-07-23 17:11:19.046687] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d4620 00:17:23.628 [2024-07-23 17:11:19.046700] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:23.628 [2024-07-23 17:11:19.048280] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:23.628 [2024-07-23 17:11:19.048306] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:23.887 BaseBdev2 00:17:23.887 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:17:24.147 [2024-07-23 17:11:19.335407] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:24.147 [2024-07-23 17:11:19.336717] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:24.147 [2024-07-23 17:11:19.336907] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2528610 00:17:24.147 [2024-07-23 17:11:19.336921] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:24.147 [2024-07-23 17:11:19.337118] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26d83e0 00:17:24.147 [2024-07-23 17:11:19.337268] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2528610 00:17:24.147 [2024-07-23 17:11:19.337278] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2528610 00:17:24.147 [2024-07-23 17:11:19.337388] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:24.147 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:24.147 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.148 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:24.407 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.407 "name": "raid_bdev1", 00:17:24.407 "uuid": "8d51e105-bb7b-4dcf-9484-810f1dc7b906", 00:17:24.407 "strip_size_kb": 0, 00:17:24.407 "state": "online", 00:17:24.407 "raid_level": "raid1", 00:17:24.407 "superblock": true, 00:17:24.407 "num_base_bdevs": 2, 00:17:24.407 "num_base_bdevs_discovered": 2, 00:17:24.407 "num_base_bdevs_operational": 2, 00:17:24.407 "base_bdevs_list": [ 00:17:24.407 { 00:17:24.407 "name": "BaseBdev1", 00:17:24.407 "uuid": "38dd7bc9-c47c-5835-bad0-5f247322ecf4", 00:17:24.407 "is_configured": true, 00:17:24.407 "data_offset": 2048, 00:17:24.407 "data_size": 63488 00:17:24.407 }, 00:17:24.407 { 00:17:24.407 "name": "BaseBdev2", 00:17:24.407 "uuid": "b379892e-05f8-5e73-be9f-586dcb3b2c7b", 00:17:24.407 "is_configured": true, 00:17:24.407 "data_offset": 2048, 00:17:24.407 "data_size": 63488 00:17:24.408 } 00:17:24.408 ] 00:17:24.408 }' 00:17:24.408 17:11:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.408 17:11:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.976 17:11:20 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:24.976 17:11:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:24.976 [2024-07-23 17:11:20.326309] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26d63e0 00:17:25.916 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:26.175 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:26.175 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.176 17:11:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.176 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:26.435 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.435 "name": "raid_bdev1", 00:17:26.435 "uuid": "8d51e105-bb7b-4dcf-9484-810f1dc7b906", 00:17:26.435 "strip_size_kb": 0, 00:17:26.435 "state": "online", 00:17:26.435 "raid_level": "raid1", 00:17:26.435 "superblock": true, 00:17:26.435 "num_base_bdevs": 2, 00:17:26.435 "num_base_bdevs_discovered": 2, 00:17:26.435 "num_base_bdevs_operational": 2, 00:17:26.435 "base_bdevs_list": [ 00:17:26.435 { 00:17:26.435 "name": "BaseBdev1", 00:17:26.435 "uuid": "38dd7bc9-c47c-5835-bad0-5f247322ecf4", 00:17:26.435 "is_configured": true, 00:17:26.435 "data_offset": 2048, 00:17:26.435 "data_size": 63488 00:17:26.435 }, 00:17:26.435 { 00:17:26.435 "name": "BaseBdev2", 00:17:26.435 "uuid": "b379892e-05f8-5e73-be9f-586dcb3b2c7b", 00:17:26.435 "is_configured": true, 00:17:26.435 "data_offset": 2048, 00:17:26.435 "data_size": 63488 00:17:26.435 } 00:17:26.435 ] 00:17:26.435 }' 00:17:26.435 17:11:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.435 17:11:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:27.005 [2024-07-23 17:11:22.328248] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:17:27.005 [2024-07-23 17:11:22.328290] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:27.005 [2024-07-23 17:11:22.331405] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:27.005 [2024-07-23 17:11:22.331437] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:27.005 [2024-07-23 17:11:22.331508] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:27.005 [2024-07-23 17:11:22.331519] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2528610 name raid_bdev1, state offline 00:17:27.005 0 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4126969 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4126969 ']' 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4126969 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4126969 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4126969' 00:17:27.005 killing process with pid 4126969 00:17:27.005 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4126969 00:17:27.005 [2024-07-23 17:11:22.413379] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:27.005 17:11:22 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4126969 00:17:27.005 [2024-07-23 17:11:22.424474] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.xxnPDpJjEU 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:27.265 00:17:27.265 real 0m7.281s 00:17:27.265 user 0m11.624s 00:17:27.265 sys 0m1.316s 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:27.265 17:11:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.265 ************************************ 00:17:27.265 END TEST raid_read_error_test 00:17:27.265 ************************************ 00:17:27.525 17:11:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:27.525 17:11:22 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:17:27.525 17:11:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:27.525 17:11:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:27.525 17:11:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:27.525 ************************************ 00:17:27.525 START TEST raid_write_error_test 00:17:27.525 
************************************ 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yqIvYjfpT7 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4128034 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4128034 /var/tmp/spdk-raid.sock 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4128034 ']' 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:27.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:27.525 17:11:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.525 [2024-07-23 17:11:22.811390] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:17:27.525 [2024-07-23 17:11:22.811465] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4128034 ] 00:17:27.525 [2024-07-23 17:11:22.945532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:27.784 [2024-07-23 17:11:23.000000] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:27.784 [2024-07-23 17:11:23.055126] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:27.784 [2024-07-23 17:11:23.055155] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:28.356 17:11:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:28.356 17:11:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:28.356 17:11:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:28.356 17:11:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:28.616 BaseBdev1_malloc 00:17:28.616 17:11:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:28.616 true 00:17:28.875 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:28.875 [2024-07-23 17:11:24.205963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:28.875 [2024-07-23 17:11:24.206008] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:17:28.875 [2024-07-23 17:11:24.206028] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x178f5c0 00:17:28.875 [2024-07-23 17:11:24.206041] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.875 [2024-07-23 17:11:24.207566] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.875 [2024-07-23 17:11:24.207595] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:28.875 BaseBdev1 00:17:28.875 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:28.875 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:29.134 BaseBdev2_malloc 00:17:29.134 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:29.393 true 00:17:29.393 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:29.393 [2024-07-23 17:11:24.751998] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:29.393 [2024-07-23 17:11:24.752045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:29.393 [2024-07-23 17:11:24.752067] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1789620 00:17:29.393 [2024-07-23 17:11:24.752079] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:29.393 [2024-07-23 17:11:24.753502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:29.393 [2024-07-23 17:11:24.753530] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:29.393 BaseBdev2 00:17:29.393 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:17:29.653 [2024-07-23 17:11:24.928492] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:29.653 [2024-07-23 17:11:24.929707] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:29.653 [2024-07-23 17:11:24.929884] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15dd610 00:17:29.653 [2024-07-23 17:11:24.929907] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:29.653 [2024-07-23 17:11:24.930097] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x178d3e0 00:17:29.653 [2024-07-23 17:11:24.930243] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15dd610 00:17:29.653 [2024-07-23 17:11:24.930253] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15dd610 00:17:29.653 [2024-07-23 17:11:24.930358] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.653 17:11:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:29.913 17:11:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.913 "name": "raid_bdev1", 00:17:29.913 "uuid": "ee9572eb-8a4a-4c18-991e-188b072c9610", 00:17:29.913 "strip_size_kb": 0, 00:17:29.913 "state": "online", 00:17:29.913 "raid_level": "raid1", 00:17:29.913 "superblock": true, 00:17:29.913 "num_base_bdevs": 2, 00:17:29.913 "num_base_bdevs_discovered": 2, 00:17:29.913 "num_base_bdevs_operational": 2, 00:17:29.913 "base_bdevs_list": [ 00:17:29.913 { 00:17:29.913 "name": "BaseBdev1", 00:17:29.913 "uuid": "21e079e6-df29-5b5c-9d27-f33a05017783", 00:17:29.913 "is_configured": true, 00:17:29.913 "data_offset": 2048, 00:17:29.913 "data_size": 63488 00:17:29.913 }, 00:17:29.913 { 00:17:29.913 "name": "BaseBdev2", 00:17:29.913 "uuid": "b7bba874-ad7a-5571-bb7d-aadff61fda75", 00:17:29.913 "is_configured": true, 00:17:29.913 "data_offset": 2048, 00:17:29.913 "data_size": 63488 00:17:29.913 } 00:17:29.913 ] 00:17:29.913 }' 00:17:29.913 17:11:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.913 17:11:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.483 
17:11:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:30.483 17:11:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:30.483 [2024-07-23 17:11:25.855229] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x178b3e0 00:17:31.421 17:11:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:31.680 [2024-07-23 17:11:26.983882] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:31.680 [2024-07-23 17:11:26.983950] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:31.680 [2024-07-23 17:11:26.984131] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x178b3e0 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:31.680 17:11:27 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.680 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:31.940 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.940 "name": "raid_bdev1", 00:17:31.940 "uuid": "ee9572eb-8a4a-4c18-991e-188b072c9610", 00:17:31.940 "strip_size_kb": 0, 00:17:31.940 "state": "online", 00:17:31.940 "raid_level": "raid1", 00:17:31.940 "superblock": true, 00:17:31.940 "num_base_bdevs": 2, 00:17:31.940 "num_base_bdevs_discovered": 1, 00:17:31.940 "num_base_bdevs_operational": 1, 00:17:31.940 "base_bdevs_list": [ 00:17:31.940 { 00:17:31.940 "name": null, 00:17:31.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.940 "is_configured": false, 00:17:31.940 "data_offset": 2048, 00:17:31.940 "data_size": 63488 00:17:31.940 }, 00:17:31.940 { 00:17:31.940 "name": "BaseBdev2", 00:17:31.940 "uuid": "b7bba874-ad7a-5571-bb7d-aadff61fda75", 00:17:31.940 "is_configured": true, 00:17:31.940 "data_offset": 2048, 00:17:31.940 "data_size": 63488 00:17:31.940 } 00:17:31.940 ] 00:17:31.940 }' 00:17:31.940 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:17:31.940 17:11:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.510 17:11:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:32.770 [2024-07-23 17:11:28.100256] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:32.770 [2024-07-23 17:11:28.100290] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:32.770 [2024-07-23 17:11:28.103522] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:32.770 [2024-07-23 17:11:28.103549] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:32.770 [2024-07-23 17:11:28.103597] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:32.770 [2024-07-23 17:11:28.103608] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15dd610 name raid_bdev1, state offline 00:17:32.770 0 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4128034 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4128034 ']' 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4128034 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4128034 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4128034' 00:17:32.770 killing process with pid 4128034 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4128034 00:17:32.770 [2024-07-23 17:11:28.173948] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:32.770 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4128034 00:17:32.770 [2024-07-23 17:11:28.185435] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yqIvYjfpT7 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:33.029 17:11:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:33.029 00:17:33.029 real 0m5.683s 00:17:33.029 user 0m8.738s 00:17:33.030 sys 0m1.020s 00:17:33.030 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:33.030 17:11:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.030 ************************************ 00:17:33.030 END TEST raid_write_error_test 00:17:33.030 ************************************ 00:17:33.289 17:11:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:33.289 17:11:28 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:33.289 17:11:28 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:33.289 17:11:28 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:17:33.289 17:11:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:33.289 17:11:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:33.289 17:11:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:33.289 ************************************ 00:17:33.289 START TEST raid_state_function_test 00:17:33.289 ************************************ 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:33.289 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:33.290 17:11:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4128913 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4128913' 00:17:33.290 Process raid pid: 4128913 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4128913 /var/tmp/spdk-raid.sock 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4128913 ']' 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:33.290 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:33.290 17:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.290 [2024-07-23 17:11:28.568916] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:17:33.290 [2024-07-23 17:11:28.568981] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:33.290 [2024-07-23 17:11:28.691508] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:33.549 [2024-07-23 17:11:28.746709] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:33.549 [2024-07-23 17:11:28.807703] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:33.549 [2024-07-23 17:11:28.807732] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:33.808 17:11:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:33.808 17:11:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:33.808 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:34.068 [2024-07-23 17:11:29.258587] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:34.068 [2024-07-23 17:11:29.258626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:34.068 [2024-07-23 17:11:29.258637] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:34.068 [2024-07-23 17:11:29.258648] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:34.068 [2024-07-23 17:11:29.258657] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:34.068 [2024-07-23 17:11:29.258668] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:34.068 17:11:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.068 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.328 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.328 "name": "Existed_Raid", 00:17:34.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.328 "strip_size_kb": 64, 00:17:34.328 "state": "configuring", 00:17:34.328 "raid_level": "raid0", 00:17:34.328 "superblock": false, 00:17:34.328 "num_base_bdevs": 3, 00:17:34.328 "num_base_bdevs_discovered": 0, 00:17:34.328 "num_base_bdevs_operational": 3, 00:17:34.328 "base_bdevs_list": [ 00:17:34.328 { 
00:17:34.328 "name": "BaseBdev1", 00:17:34.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.328 "is_configured": false, 00:17:34.328 "data_offset": 0, 00:17:34.328 "data_size": 0 00:17:34.328 }, 00:17:34.328 { 00:17:34.328 "name": "BaseBdev2", 00:17:34.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.328 "is_configured": false, 00:17:34.328 "data_offset": 0, 00:17:34.328 "data_size": 0 00:17:34.328 }, 00:17:34.328 { 00:17:34.328 "name": "BaseBdev3", 00:17:34.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.328 "is_configured": false, 00:17:34.328 "data_offset": 0, 00:17:34.328 "data_size": 0 00:17:34.328 } 00:17:34.328 ] 00:17:34.328 }' 00:17:34.328 17:11:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.328 17:11:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.897 17:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:35.155 [2024-07-23 17:11:30.573910] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:35.155 [2024-07-23 17:11:30.573943] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x100d280 name Existed_Raid, state configuring 00:17:35.414 17:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:35.414 [2024-07-23 17:11:30.830596] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:35.414 [2024-07-23 17:11:30.830628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:35.414 [2024-07-23 17:11:30.830638] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:17:35.414 [2024-07-23 17:11:30.830650] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:35.414 [2024-07-23 17:11:30.830659] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:35.414 [2024-07-23 17:11:30.830671] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:35.674 17:11:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:35.674 [2024-07-23 17:11:31.081053] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:35.674 BaseBdev1 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:35.934 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:36.193 [ 00:17:36.193 { 00:17:36.193 "name": "BaseBdev1", 00:17:36.193 "aliases": [ 00:17:36.193 
"d0093fdb-a161-4215-bf88-5f7f7c6f9665" 00:17:36.193 ], 00:17:36.193 "product_name": "Malloc disk", 00:17:36.193 "block_size": 512, 00:17:36.193 "num_blocks": 65536, 00:17:36.193 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:36.193 "assigned_rate_limits": { 00:17:36.193 "rw_ios_per_sec": 0, 00:17:36.193 "rw_mbytes_per_sec": 0, 00:17:36.193 "r_mbytes_per_sec": 0, 00:17:36.193 "w_mbytes_per_sec": 0 00:17:36.193 }, 00:17:36.193 "claimed": true, 00:17:36.193 "claim_type": "exclusive_write", 00:17:36.193 "zoned": false, 00:17:36.193 "supported_io_types": { 00:17:36.193 "read": true, 00:17:36.193 "write": true, 00:17:36.193 "unmap": true, 00:17:36.193 "flush": true, 00:17:36.193 "reset": true, 00:17:36.193 "nvme_admin": false, 00:17:36.193 "nvme_io": false, 00:17:36.193 "nvme_io_md": false, 00:17:36.193 "write_zeroes": true, 00:17:36.193 "zcopy": true, 00:17:36.193 "get_zone_info": false, 00:17:36.193 "zone_management": false, 00:17:36.193 "zone_append": false, 00:17:36.193 "compare": false, 00:17:36.193 "compare_and_write": false, 00:17:36.194 "abort": true, 00:17:36.194 "seek_hole": false, 00:17:36.194 "seek_data": false, 00:17:36.194 "copy": true, 00:17:36.194 "nvme_iov_md": false 00:17:36.194 }, 00:17:36.194 "memory_domains": [ 00:17:36.194 { 00:17:36.194 "dma_device_id": "system", 00:17:36.194 "dma_device_type": 1 00:17:36.194 }, 00:17:36.194 { 00:17:36.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.194 "dma_device_type": 2 00:17:36.194 } 00:17:36.194 ], 00:17:36.194 "driver_specific": {} 00:17:36.194 } 00:17:36.194 ] 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.194 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.452 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.452 "name": "Existed_Raid", 00:17:36.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.452 "strip_size_kb": 64, 00:17:36.452 "state": "configuring", 00:17:36.452 "raid_level": "raid0", 00:17:36.452 "superblock": false, 00:17:36.452 "num_base_bdevs": 3, 00:17:36.452 "num_base_bdevs_discovered": 1, 00:17:36.452 "num_base_bdevs_operational": 3, 00:17:36.452 "base_bdevs_list": [ 00:17:36.452 { 00:17:36.452 "name": "BaseBdev1", 00:17:36.452 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:36.452 "is_configured": true, 00:17:36.452 "data_offset": 0, 00:17:36.452 "data_size": 65536 00:17:36.452 }, 00:17:36.452 { 00:17:36.452 "name": "BaseBdev2", 00:17:36.452 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:36.452 "is_configured": false, 00:17:36.452 "data_offset": 0, 00:17:36.452 "data_size": 0 00:17:36.452 }, 00:17:36.452 { 00:17:36.452 "name": "BaseBdev3", 00:17:36.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.452 "is_configured": false, 00:17:36.452 "data_offset": 0, 00:17:36.452 "data_size": 0 00:17:36.452 } 00:17:36.452 ] 00:17:36.452 }' 00:17:36.452 17:11:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.452 17:11:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.019 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:37.278 [2024-07-23 17:11:32.657216] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:37.278 [2024-07-23 17:11:32.657256] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x100cbb0 name Existed_Raid, state configuring 00:17:37.278 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:37.536 [2024-07-23 17:11:32.905910] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:37.536 [2024-07-23 17:11:32.907329] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:37.536 [2024-07-23 17:11:32.907362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:37.536 [2024-07-23 17:11:32.907372] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:37.536 [2024-07-23 17:11:32.907383] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.536 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.537 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.537 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.537 17:11:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.827 17:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.827 "name": "Existed_Raid", 00:17:37.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.827 "strip_size_kb": 64, 00:17:37.827 "state": "configuring", 00:17:37.827 
"raid_level": "raid0", 00:17:37.827 "superblock": false, 00:17:37.827 "num_base_bdevs": 3, 00:17:37.827 "num_base_bdevs_discovered": 1, 00:17:37.827 "num_base_bdevs_operational": 3, 00:17:37.827 "base_bdevs_list": [ 00:17:37.827 { 00:17:37.828 "name": "BaseBdev1", 00:17:37.828 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:37.828 "is_configured": true, 00:17:37.828 "data_offset": 0, 00:17:37.828 "data_size": 65536 00:17:37.828 }, 00:17:37.828 { 00:17:37.828 "name": "BaseBdev2", 00:17:37.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.828 "is_configured": false, 00:17:37.828 "data_offset": 0, 00:17:37.828 "data_size": 0 00:17:37.828 }, 00:17:37.828 { 00:17:37.828 "name": "BaseBdev3", 00:17:37.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.828 "is_configured": false, 00:17:37.828 "data_offset": 0, 00:17:37.828 "data_size": 0 00:17:37.828 } 00:17:37.828 ] 00:17:37.828 }' 00:17:37.828 17:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.828 17:11:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.394 17:11:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:38.961 [2024-07-23 17:11:34.268852] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:38.961 BaseBdev2 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:38.961 17:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.529 17:11:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:40.097 [ 00:17:40.097 { 00:17:40.097 "name": "BaseBdev2", 00:17:40.097 "aliases": [ 00:17:40.097 "107009b9-bc26-43d0-8d76-09b1b870937b" 00:17:40.097 ], 00:17:40.097 "product_name": "Malloc disk", 00:17:40.097 "block_size": 512, 00:17:40.097 "num_blocks": 65536, 00:17:40.097 "uuid": "107009b9-bc26-43d0-8d76-09b1b870937b", 00:17:40.097 "assigned_rate_limits": { 00:17:40.097 "rw_ios_per_sec": 0, 00:17:40.097 "rw_mbytes_per_sec": 0, 00:17:40.097 "r_mbytes_per_sec": 0, 00:17:40.097 "w_mbytes_per_sec": 0 00:17:40.097 }, 00:17:40.097 "claimed": true, 00:17:40.097 "claim_type": "exclusive_write", 00:17:40.097 "zoned": false, 00:17:40.097 "supported_io_types": { 00:17:40.097 "read": true, 00:17:40.097 "write": true, 00:17:40.097 "unmap": true, 00:17:40.097 "flush": true, 00:17:40.097 "reset": true, 00:17:40.097 "nvme_admin": false, 00:17:40.097 "nvme_io": false, 00:17:40.097 "nvme_io_md": false, 00:17:40.097 "write_zeroes": true, 00:17:40.097 "zcopy": true, 00:17:40.097 "get_zone_info": false, 00:17:40.097 "zone_management": false, 00:17:40.097 "zone_append": false, 00:17:40.097 "compare": false, 00:17:40.097 "compare_and_write": false, 00:17:40.097 "abort": true, 00:17:40.097 "seek_hole": false, 00:17:40.097 "seek_data": false, 00:17:40.097 "copy": true, 00:17:40.097 "nvme_iov_md": false 00:17:40.097 }, 00:17:40.097 "memory_domains": [ 00:17:40.097 { 00:17:40.097 "dma_device_id": "system", 
00:17:40.097 "dma_device_type": 1 00:17:40.097 }, 00:17:40.097 { 00:17:40.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.097 "dma_device_type": 2 00:17:40.097 } 00:17:40.097 ], 00:17:40.097 "driver_specific": {} 00:17:40.097 } 00:17:40.097 ] 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.097 17:11:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.665 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.665 "name": "Existed_Raid", 00:17:40.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.665 "strip_size_kb": 64, 00:17:40.665 "state": "configuring", 00:17:40.665 "raid_level": "raid0", 00:17:40.665 "superblock": false, 00:17:40.665 "num_base_bdevs": 3, 00:17:40.665 "num_base_bdevs_discovered": 2, 00:17:40.665 "num_base_bdevs_operational": 3, 00:17:40.665 "base_bdevs_list": [ 00:17:40.665 { 00:17:40.665 "name": "BaseBdev1", 00:17:40.665 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:40.665 "is_configured": true, 00:17:40.665 "data_offset": 0, 00:17:40.665 "data_size": 65536 00:17:40.665 }, 00:17:40.665 { 00:17:40.665 "name": "BaseBdev2", 00:17:40.665 "uuid": "107009b9-bc26-43d0-8d76-09b1b870937b", 00:17:40.665 "is_configured": true, 00:17:40.665 "data_offset": 0, 00:17:40.665 "data_size": 65536 00:17:40.665 }, 00:17:40.665 { 00:17:40.665 "name": "BaseBdev3", 00:17:40.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.665 "is_configured": false, 00:17:40.665 "data_offset": 0, 00:17:40.665 "data_size": 0 00:17:40.665 } 00:17:40.665 ] 00:17:40.665 }' 00:17:40.665 17:11:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.665 17:11:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.234 17:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:41.802 [2024-07-23 17:11:36.932450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.802 [2024-07-23 17:11:36.932486] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x100c800 00:17:41.802 [2024-07-23 17:11:36.932495] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:17:41.802 [2024-07-23 17:11:36.932746] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1010b50 00:17:41.802 [2024-07-23 17:11:36.932862] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x100c800 00:17:41.802 [2024-07-23 17:11:36.932872] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x100c800 00:17:41.802 [2024-07-23 17:11:36.933049] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:41.802 BaseBdev3 00:17:41.802 17:11:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:41.802 17:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:41.802 17:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:41.803 17:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:41.803 17:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:41.803 17:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:41.803 17:11:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:41.803 17:11:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:42.371 [ 00:17:42.371 { 00:17:42.371 "name": "BaseBdev3", 00:17:42.371 "aliases": [ 00:17:42.371 "c3cf7a11-4395-4c38-a42c-6c9fdd01a156" 00:17:42.371 ], 00:17:42.371 "product_name": "Malloc disk", 00:17:42.371 "block_size": 512, 00:17:42.371 "num_blocks": 65536, 00:17:42.371 
"uuid": "c3cf7a11-4395-4c38-a42c-6c9fdd01a156", 00:17:42.372 "assigned_rate_limits": { 00:17:42.372 "rw_ios_per_sec": 0, 00:17:42.372 "rw_mbytes_per_sec": 0, 00:17:42.372 "r_mbytes_per_sec": 0, 00:17:42.372 "w_mbytes_per_sec": 0 00:17:42.372 }, 00:17:42.372 "claimed": true, 00:17:42.372 "claim_type": "exclusive_write", 00:17:42.372 "zoned": false, 00:17:42.372 "supported_io_types": { 00:17:42.372 "read": true, 00:17:42.372 "write": true, 00:17:42.372 "unmap": true, 00:17:42.372 "flush": true, 00:17:42.372 "reset": true, 00:17:42.372 "nvme_admin": false, 00:17:42.372 "nvme_io": false, 00:17:42.372 "nvme_io_md": false, 00:17:42.372 "write_zeroes": true, 00:17:42.372 "zcopy": true, 00:17:42.372 "get_zone_info": false, 00:17:42.372 "zone_management": false, 00:17:42.372 "zone_append": false, 00:17:42.372 "compare": false, 00:17:42.372 "compare_and_write": false, 00:17:42.372 "abort": true, 00:17:42.372 "seek_hole": false, 00:17:42.372 "seek_data": false, 00:17:42.372 "copy": true, 00:17:42.372 "nvme_iov_md": false 00:17:42.372 }, 00:17:42.372 "memory_domains": [ 00:17:42.372 { 00:17:42.372 "dma_device_id": "system", 00:17:42.372 "dma_device_type": 1 00:17:42.372 }, 00:17:42.372 { 00:17:42.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.372 "dma_device_type": 2 00:17:42.372 } 00:17:42.372 ], 00:17:42.372 "driver_specific": {} 00:17:42.372 } 00:17:42.372 ] 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.372 17:11:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.372 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.631 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.631 "name": "Existed_Raid", 00:17:42.631 "uuid": "5bacf801-1a1b-49b4-850b-16d1745ee859", 00:17:42.631 "strip_size_kb": 64, 00:17:42.631 "state": "online", 00:17:42.631 "raid_level": "raid0", 00:17:42.631 "superblock": false, 00:17:42.631 "num_base_bdevs": 3, 00:17:42.631 "num_base_bdevs_discovered": 3, 00:17:42.631 "num_base_bdevs_operational": 3, 00:17:42.631 "base_bdevs_list": [ 00:17:42.631 { 00:17:42.631 "name": "BaseBdev1", 00:17:42.632 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:42.632 "is_configured": true, 00:17:42.632 "data_offset": 0, 00:17:42.632 "data_size": 65536 00:17:42.632 }, 00:17:42.632 { 00:17:42.632 "name": "BaseBdev2", 00:17:42.632 "uuid": 
"107009b9-bc26-43d0-8d76-09b1b870937b", 00:17:42.632 "is_configured": true, 00:17:42.632 "data_offset": 0, 00:17:42.632 "data_size": 65536 00:17:42.632 }, 00:17:42.632 { 00:17:42.632 "name": "BaseBdev3", 00:17:42.632 "uuid": "c3cf7a11-4395-4c38-a42c-6c9fdd01a156", 00:17:42.632 "is_configured": true, 00:17:42.632 "data_offset": 0, 00:17:42.632 "data_size": 65536 00:17:42.632 } 00:17:42.632 ] 00:17:42.632 }' 00:17:42.632 17:11:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.632 17:11:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:43.200 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:43.459 [2024-07-23 17:11:38.717487] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:43.459 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:43.459 "name": "Existed_Raid", 00:17:43.459 "aliases": [ 00:17:43.459 "5bacf801-1a1b-49b4-850b-16d1745ee859" 00:17:43.459 ], 00:17:43.459 "product_name": "Raid Volume", 
00:17:43.459 "block_size": 512, 00:17:43.459 "num_blocks": 196608, 00:17:43.459 "uuid": "5bacf801-1a1b-49b4-850b-16d1745ee859", 00:17:43.459 "assigned_rate_limits": { 00:17:43.459 "rw_ios_per_sec": 0, 00:17:43.459 "rw_mbytes_per_sec": 0, 00:17:43.459 "r_mbytes_per_sec": 0, 00:17:43.459 "w_mbytes_per_sec": 0 00:17:43.459 }, 00:17:43.459 "claimed": false, 00:17:43.459 "zoned": false, 00:17:43.459 "supported_io_types": { 00:17:43.459 "read": true, 00:17:43.459 "write": true, 00:17:43.459 "unmap": true, 00:17:43.459 "flush": true, 00:17:43.459 "reset": true, 00:17:43.459 "nvme_admin": false, 00:17:43.459 "nvme_io": false, 00:17:43.459 "nvme_io_md": false, 00:17:43.459 "write_zeroes": true, 00:17:43.459 "zcopy": false, 00:17:43.459 "get_zone_info": false, 00:17:43.459 "zone_management": false, 00:17:43.459 "zone_append": false, 00:17:43.459 "compare": false, 00:17:43.459 "compare_and_write": false, 00:17:43.459 "abort": false, 00:17:43.459 "seek_hole": false, 00:17:43.459 "seek_data": false, 00:17:43.459 "copy": false, 00:17:43.459 "nvme_iov_md": false 00:17:43.459 }, 00:17:43.459 "memory_domains": [ 00:17:43.459 { 00:17:43.460 "dma_device_id": "system", 00:17:43.460 "dma_device_type": 1 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.460 "dma_device_type": 2 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "dma_device_id": "system", 00:17:43.460 "dma_device_type": 1 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.460 "dma_device_type": 2 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "dma_device_id": "system", 00:17:43.460 "dma_device_type": 1 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.460 "dma_device_type": 2 00:17:43.460 } 00:17:43.460 ], 00:17:43.460 "driver_specific": { 00:17:43.460 "raid": { 00:17:43.460 "uuid": "5bacf801-1a1b-49b4-850b-16d1745ee859", 00:17:43.460 "strip_size_kb": 64, 00:17:43.460 "state": "online", 00:17:43.460 
"raid_level": "raid0", 00:17:43.460 "superblock": false, 00:17:43.460 "num_base_bdevs": 3, 00:17:43.460 "num_base_bdevs_discovered": 3, 00:17:43.460 "num_base_bdevs_operational": 3, 00:17:43.460 "base_bdevs_list": [ 00:17:43.460 { 00:17:43.460 "name": "BaseBdev1", 00:17:43.460 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:43.460 "is_configured": true, 00:17:43.460 "data_offset": 0, 00:17:43.460 "data_size": 65536 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "name": "BaseBdev2", 00:17:43.460 "uuid": "107009b9-bc26-43d0-8d76-09b1b870937b", 00:17:43.460 "is_configured": true, 00:17:43.460 "data_offset": 0, 00:17:43.460 "data_size": 65536 00:17:43.460 }, 00:17:43.460 { 00:17:43.460 "name": "BaseBdev3", 00:17:43.460 "uuid": "c3cf7a11-4395-4c38-a42c-6c9fdd01a156", 00:17:43.460 "is_configured": true, 00:17:43.460 "data_offset": 0, 00:17:43.460 "data_size": 65536 00:17:43.460 } 00:17:43.460 ] 00:17:43.460 } 00:17:43.460 } 00:17:43.460 }' 00:17:43.460 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:43.460 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:43.460 BaseBdev2 00:17:43.460 BaseBdev3' 00:17:43.460 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.460 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:43.460 17:11:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.720 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.720 "name": "BaseBdev1", 00:17:43.720 "aliases": [ 00:17:43.720 "d0093fdb-a161-4215-bf88-5f7f7c6f9665" 00:17:43.720 ], 00:17:43.720 "product_name": "Malloc disk", 00:17:43.720 
"block_size": 512, 00:17:43.720 "num_blocks": 65536, 00:17:43.720 "uuid": "d0093fdb-a161-4215-bf88-5f7f7c6f9665", 00:17:43.720 "assigned_rate_limits": { 00:17:43.720 "rw_ios_per_sec": 0, 00:17:43.720 "rw_mbytes_per_sec": 0, 00:17:43.720 "r_mbytes_per_sec": 0, 00:17:43.720 "w_mbytes_per_sec": 0 00:17:43.720 }, 00:17:43.720 "claimed": true, 00:17:43.720 "claim_type": "exclusive_write", 00:17:43.720 "zoned": false, 00:17:43.720 "supported_io_types": { 00:17:43.720 "read": true, 00:17:43.720 "write": true, 00:17:43.720 "unmap": true, 00:17:43.720 "flush": true, 00:17:43.720 "reset": true, 00:17:43.720 "nvme_admin": false, 00:17:43.720 "nvme_io": false, 00:17:43.720 "nvme_io_md": false, 00:17:43.720 "write_zeroes": true, 00:17:43.720 "zcopy": true, 00:17:43.720 "get_zone_info": false, 00:17:43.720 "zone_management": false, 00:17:43.720 "zone_append": false, 00:17:43.720 "compare": false, 00:17:43.720 "compare_and_write": false, 00:17:43.720 "abort": true, 00:17:43.720 "seek_hole": false, 00:17:43.720 "seek_data": false, 00:17:43.720 "copy": true, 00:17:43.720 "nvme_iov_md": false 00:17:43.720 }, 00:17:43.720 "memory_domains": [ 00:17:43.720 { 00:17:43.720 "dma_device_id": "system", 00:17:43.720 "dma_device_type": 1 00:17:43.720 }, 00:17:43.720 { 00:17:43.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.720 "dma_device_type": 2 00:17:43.720 } 00:17:43.720 ], 00:17:43.720 "driver_specific": {} 00:17:43.720 }' 00:17:43.720 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.720 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.980 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.239 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.239 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.239 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:44.239 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:44.498 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:44.498 "name": "BaseBdev2", 00:17:44.498 "aliases": [ 00:17:44.498 "107009b9-bc26-43d0-8d76-09b1b870937b" 00:17:44.498 ], 00:17:44.498 "product_name": "Malloc disk", 00:17:44.498 "block_size": 512, 00:17:44.498 "num_blocks": 65536, 00:17:44.498 "uuid": "107009b9-bc26-43d0-8d76-09b1b870937b", 00:17:44.498 "assigned_rate_limits": { 00:17:44.498 "rw_ios_per_sec": 0, 00:17:44.498 "rw_mbytes_per_sec": 0, 00:17:44.498 "r_mbytes_per_sec": 0, 00:17:44.498 "w_mbytes_per_sec": 0 00:17:44.498 }, 00:17:44.498 "claimed": true, 00:17:44.498 "claim_type": "exclusive_write", 00:17:44.498 "zoned": false, 00:17:44.498 "supported_io_types": { 00:17:44.498 "read": true, 00:17:44.498 "write": true, 00:17:44.498 "unmap": true, 00:17:44.498 "flush": true, 00:17:44.498 "reset": true, 00:17:44.498 "nvme_admin": 
false, 00:17:44.498 "nvme_io": false, 00:17:44.498 "nvme_io_md": false, 00:17:44.498 "write_zeroes": true, 00:17:44.498 "zcopy": true, 00:17:44.499 "get_zone_info": false, 00:17:44.499 "zone_management": false, 00:17:44.499 "zone_append": false, 00:17:44.499 "compare": false, 00:17:44.499 "compare_and_write": false, 00:17:44.499 "abort": true, 00:17:44.499 "seek_hole": false, 00:17:44.499 "seek_data": false, 00:17:44.499 "copy": true, 00:17:44.499 "nvme_iov_md": false 00:17:44.499 }, 00:17:44.499 "memory_domains": [ 00:17:44.499 { 00:17:44.499 "dma_device_id": "system", 00:17:44.499 "dma_device_type": 1 00:17:44.499 }, 00:17:44.499 { 00:17:44.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.499 "dma_device_type": 2 00:17:44.499 } 00:17:44.499 ], 00:17:44.499 "driver_specific": {} 00:17:44.499 }' 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.499 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.758 17:11:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.758 17:11:40 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.758 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.758 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:44.758 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.017 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.017 "name": "BaseBdev3", 00:17:45.017 "aliases": [ 00:17:45.017 "c3cf7a11-4395-4c38-a42c-6c9fdd01a156" 00:17:45.017 ], 00:17:45.017 "product_name": "Malloc disk", 00:17:45.017 "block_size": 512, 00:17:45.017 "num_blocks": 65536, 00:17:45.017 "uuid": "c3cf7a11-4395-4c38-a42c-6c9fdd01a156", 00:17:45.017 "assigned_rate_limits": { 00:17:45.017 "rw_ios_per_sec": 0, 00:17:45.017 "rw_mbytes_per_sec": 0, 00:17:45.017 "r_mbytes_per_sec": 0, 00:17:45.017 "w_mbytes_per_sec": 0 00:17:45.017 }, 00:17:45.017 "claimed": true, 00:17:45.017 "claim_type": "exclusive_write", 00:17:45.017 "zoned": false, 00:17:45.018 "supported_io_types": { 00:17:45.018 "read": true, 00:17:45.018 "write": true, 00:17:45.018 "unmap": true, 00:17:45.018 "flush": true, 00:17:45.018 "reset": true, 00:17:45.018 "nvme_admin": false, 00:17:45.018 "nvme_io": false, 00:17:45.018 "nvme_io_md": false, 00:17:45.018 "write_zeroes": true, 00:17:45.018 "zcopy": true, 00:17:45.018 "get_zone_info": false, 00:17:45.018 "zone_management": false, 00:17:45.018 "zone_append": false, 00:17:45.018 "compare": false, 00:17:45.018 "compare_and_write": false, 00:17:45.018 "abort": true, 00:17:45.018 "seek_hole": false, 00:17:45.018 "seek_data": false, 00:17:45.018 "copy": true, 00:17:45.018 "nvme_iov_md": false 00:17:45.018 }, 00:17:45.018 "memory_domains": [ 00:17:45.018 { 00:17:45.018 "dma_device_id": "system", 00:17:45.018 "dma_device_type": 1 00:17:45.018 
}, 00:17:45.018 { 00:17:45.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.018 "dma_device_type": 2 00:17:45.018 } 00:17:45.018 ], 00:17:45.018 "driver_specific": {} 00:17:45.018 }' 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.018 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.277 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.277 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.277 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.277 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.277 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.277 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:45.537 [2024-07-23 17:11:40.814807] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:45.537 [2024-07-23 17:11:40.814835] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:45.537 [2024-07-23 17:11:40.814879] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:45.537 
17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.537 17:11:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:17:45.796 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.796 "name": "Existed_Raid", 00:17:45.796 "uuid": "5bacf801-1a1b-49b4-850b-16d1745ee859", 00:17:45.796 "strip_size_kb": 64, 00:17:45.796 "state": "offline", 00:17:45.796 "raid_level": "raid0", 00:17:45.796 "superblock": false, 00:17:45.796 "num_base_bdevs": 3, 00:17:45.796 "num_base_bdevs_discovered": 2, 00:17:45.796 "num_base_bdevs_operational": 2, 00:17:45.796 "base_bdevs_list": [ 00:17:45.796 { 00:17:45.796 "name": null, 00:17:45.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.797 "is_configured": false, 00:17:45.797 "data_offset": 0, 00:17:45.797 "data_size": 65536 00:17:45.797 }, 00:17:45.797 { 00:17:45.797 "name": "BaseBdev2", 00:17:45.797 "uuid": "107009b9-bc26-43d0-8d76-09b1b870937b", 00:17:45.797 "is_configured": true, 00:17:45.797 "data_offset": 0, 00:17:45.797 "data_size": 65536 00:17:45.797 }, 00:17:45.797 { 00:17:45.797 "name": "BaseBdev3", 00:17:45.797 "uuid": "c3cf7a11-4395-4c38-a42c-6c9fdd01a156", 00:17:45.797 "is_configured": true, 00:17:45.797 "data_offset": 0, 00:17:45.797 "data_size": 65536 00:17:45.797 } 00:17:45.797 ] 00:17:45.797 }' 00:17:45.797 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.797 17:11:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.365 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:46.365 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:46.365 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.365 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:46.624 17:11:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:46.624 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:46.624 17:11:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:46.883 [2024-07-23 17:11:42.139368] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:46.883 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:46.883 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:46.883 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.883 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:47.143 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:47.143 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:47.143 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:47.403 [2024-07-23 17:11:42.649072] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:47.403 [2024-07-23 17:11:42.649115] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x100c800 name Existed_Raid, state offline 00:17:47.403 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:47.403 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:47.403 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.403 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:47.662 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:47.662 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:47.662 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:47.662 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:47.662 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:47.662 17:11:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:47.920 BaseBdev2 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:47.920 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.178 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:48.436 [ 00:17:48.436 { 00:17:48.436 "name": "BaseBdev2", 00:17:48.436 "aliases": [ 00:17:48.436 "0b0c6dbb-fe5e-4b75-889d-4a1667e04853" 00:17:48.436 ], 00:17:48.436 "product_name": "Malloc disk", 00:17:48.436 "block_size": 512, 00:17:48.436 "num_blocks": 65536, 00:17:48.436 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:48.436 "assigned_rate_limits": { 00:17:48.436 "rw_ios_per_sec": 0, 00:17:48.436 "rw_mbytes_per_sec": 0, 00:17:48.436 "r_mbytes_per_sec": 0, 00:17:48.436 "w_mbytes_per_sec": 0 00:17:48.436 }, 00:17:48.436 "claimed": false, 00:17:48.436 "zoned": false, 00:17:48.436 "supported_io_types": { 00:17:48.436 "read": true, 00:17:48.436 "write": true, 00:17:48.436 "unmap": true, 00:17:48.436 "flush": true, 00:17:48.436 "reset": true, 00:17:48.436 "nvme_admin": false, 00:17:48.436 "nvme_io": false, 00:17:48.436 "nvme_io_md": false, 00:17:48.436 "write_zeroes": true, 00:17:48.436 "zcopy": true, 00:17:48.436 "get_zone_info": false, 00:17:48.436 "zone_management": false, 00:17:48.436 "zone_append": false, 00:17:48.436 "compare": false, 00:17:48.436 "compare_and_write": false, 00:17:48.436 "abort": true, 00:17:48.436 "seek_hole": false, 00:17:48.436 "seek_data": false, 00:17:48.436 "copy": true, 00:17:48.436 "nvme_iov_md": false 00:17:48.436 }, 00:17:48.436 "memory_domains": [ 00:17:48.436 { 00:17:48.436 "dma_device_id": "system", 00:17:48.436 "dma_device_type": 1 00:17:48.436 }, 00:17:48.436 { 00:17:48.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.436 "dma_device_type": 2 00:17:48.436 } 00:17:48.436 ], 00:17:48.436 "driver_specific": {} 00:17:48.436 } 00:17:48.436 ] 00:17:48.436 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:48.436 17:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:48.436 17:11:43 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:48.436 17:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:48.695 BaseBdev3 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:48.695 17:11:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.955 17:11:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:48.955 [ 00:17:48.955 { 00:17:48.955 "name": "BaseBdev3", 00:17:48.955 "aliases": [ 00:17:48.955 "10a6c59a-6b7f-4a6d-8dad-3f7966710d02" 00:17:48.955 ], 00:17:48.955 "product_name": "Malloc disk", 00:17:48.955 "block_size": 512, 00:17:48.955 "num_blocks": 65536, 00:17:48.955 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:48.955 "assigned_rate_limits": { 00:17:48.955 "rw_ios_per_sec": 0, 00:17:48.955 "rw_mbytes_per_sec": 0, 00:17:48.955 "r_mbytes_per_sec": 0, 00:17:48.955 "w_mbytes_per_sec": 0 00:17:48.955 }, 00:17:48.955 "claimed": false, 00:17:48.955 "zoned": false, 00:17:48.955 
"supported_io_types": { 00:17:48.955 "read": true, 00:17:48.955 "write": true, 00:17:48.955 "unmap": true, 00:17:48.955 "flush": true, 00:17:48.955 "reset": true, 00:17:48.955 "nvme_admin": false, 00:17:48.955 "nvme_io": false, 00:17:48.955 "nvme_io_md": false, 00:17:48.955 "write_zeroes": true, 00:17:48.955 "zcopy": true, 00:17:48.955 "get_zone_info": false, 00:17:48.955 "zone_management": false, 00:17:48.955 "zone_append": false, 00:17:48.955 "compare": false, 00:17:48.955 "compare_and_write": false, 00:17:48.955 "abort": true, 00:17:48.955 "seek_hole": false, 00:17:48.955 "seek_data": false, 00:17:48.955 "copy": true, 00:17:48.955 "nvme_iov_md": false 00:17:48.955 }, 00:17:48.955 "memory_domains": [ 00:17:48.955 { 00:17:48.955 "dma_device_id": "system", 00:17:48.955 "dma_device_type": 1 00:17:48.955 }, 00:17:48.955 { 00:17:48.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.955 "dma_device_type": 2 00:17:48.955 } 00:17:48.955 ], 00:17:48.955 "driver_specific": {} 00:17:48.955 } 00:17:48.955 ] 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:49.214 [2024-07-23 17:11:44.609408] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:49.214 [2024-07-23 17:11:44.609458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:49.214 [2024-07-23 17:11:44.609477] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:49.214 
[2024-07-23 17:11:44.610813] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.214 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.474 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.474 "name": "Existed_Raid", 00:17:49.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.474 "strip_size_kb": 64, 00:17:49.474 "state": "configuring", 00:17:49.474 "raid_level": "raid0", 00:17:49.474 "superblock": false, 00:17:49.474 "num_base_bdevs": 3, 00:17:49.474 
"num_base_bdevs_discovered": 2, 00:17:49.474 "num_base_bdevs_operational": 3, 00:17:49.474 "base_bdevs_list": [ 00:17:49.474 { 00:17:49.474 "name": "BaseBdev1", 00:17:49.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.474 "is_configured": false, 00:17:49.474 "data_offset": 0, 00:17:49.474 "data_size": 0 00:17:49.474 }, 00:17:49.474 { 00:17:49.474 "name": "BaseBdev2", 00:17:49.474 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:49.474 "is_configured": true, 00:17:49.474 "data_offset": 0, 00:17:49.474 "data_size": 65536 00:17:49.474 }, 00:17:49.474 { 00:17:49.474 "name": "BaseBdev3", 00:17:49.474 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:49.474 "is_configured": true, 00:17:49.474 "data_offset": 0, 00:17:49.474 "data_size": 65536 00:17:49.474 } 00:17:49.474 ] 00:17:49.474 }' 00:17:49.474 17:11:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.474 17:11:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:50.411 [2024-07-23 17:11:45.716352] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.411 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.670 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.670 "name": "Existed_Raid", 00:17:50.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.670 "strip_size_kb": 64, 00:17:50.670 "state": "configuring", 00:17:50.670 "raid_level": "raid0", 00:17:50.670 "superblock": false, 00:17:50.670 "num_base_bdevs": 3, 00:17:50.670 "num_base_bdevs_discovered": 1, 00:17:50.670 "num_base_bdevs_operational": 3, 00:17:50.670 "base_bdevs_list": [ 00:17:50.670 { 00:17:50.670 "name": "BaseBdev1", 00:17:50.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.670 "is_configured": false, 00:17:50.670 "data_offset": 0, 00:17:50.671 "data_size": 0 00:17:50.671 }, 00:17:50.671 { 00:17:50.671 "name": null, 00:17:50.671 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:50.671 "is_configured": false, 00:17:50.671 "data_offset": 0, 00:17:50.671 "data_size": 65536 00:17:50.671 }, 00:17:50.671 { 00:17:50.671 "name": "BaseBdev3", 00:17:50.671 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:50.671 "is_configured": true, 00:17:50.671 "data_offset": 0, 00:17:50.671 "data_size": 65536 00:17:50.671 } 
00:17:50.671 ] 00:17:50.671 }' 00:17:50.671 17:11:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.671 17:11:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.239 17:11:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.239 17:11:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:51.498 17:11:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:51.498 17:11:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:51.757 [2024-07-23 17:11:46.967420] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:51.757 BaseBdev1 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:51.757 17:11:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.016 17:11:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:52.310 [ 00:17:52.310 { 00:17:52.310 "name": "BaseBdev1", 00:17:52.310 "aliases": [ 00:17:52.310 "c4ab5f9a-8a91-43b6-9470-5560514557d9" 00:17:52.310 ], 00:17:52.310 "product_name": "Malloc disk", 00:17:52.310 "block_size": 512, 00:17:52.310 "num_blocks": 65536, 00:17:52.310 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:52.311 "assigned_rate_limits": { 00:17:52.311 "rw_ios_per_sec": 0, 00:17:52.311 "rw_mbytes_per_sec": 0, 00:17:52.311 "r_mbytes_per_sec": 0, 00:17:52.311 "w_mbytes_per_sec": 0 00:17:52.311 }, 00:17:52.311 "claimed": true, 00:17:52.311 "claim_type": "exclusive_write", 00:17:52.311 "zoned": false, 00:17:52.311 "supported_io_types": { 00:17:52.311 "read": true, 00:17:52.311 "write": true, 00:17:52.311 "unmap": true, 00:17:52.311 "flush": true, 00:17:52.311 "reset": true, 00:17:52.311 "nvme_admin": false, 00:17:52.311 "nvme_io": false, 00:17:52.311 "nvme_io_md": false, 00:17:52.311 "write_zeroes": true, 00:17:52.311 "zcopy": true, 00:17:52.311 "get_zone_info": false, 00:17:52.311 "zone_management": false, 00:17:52.311 "zone_append": false, 00:17:52.311 "compare": false, 00:17:52.311 "compare_and_write": false, 00:17:52.311 "abort": true, 00:17:52.311 "seek_hole": false, 00:17:52.311 "seek_data": false, 00:17:52.311 "copy": true, 00:17:52.311 "nvme_iov_md": false 00:17:52.311 }, 00:17:52.311 "memory_domains": [ 00:17:52.311 { 00:17:52.311 "dma_device_id": "system", 00:17:52.311 "dma_device_type": 1 00:17:52.311 }, 00:17:52.311 { 00:17:52.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.311 "dma_device_type": 2 00:17:52.311 } 00:17:52.311 ], 00:17:52.311 "driver_specific": {} 00:17:52.311 } 00:17:52.311 ] 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.311 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.579 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.579 "name": "Existed_Raid", 00:17:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.579 "strip_size_kb": 64, 00:17:52.579 "state": "configuring", 00:17:52.579 "raid_level": "raid0", 00:17:52.579 "superblock": false, 00:17:52.579 "num_base_bdevs": 3, 00:17:52.579 "num_base_bdevs_discovered": 2, 00:17:52.579 "num_base_bdevs_operational": 3, 00:17:52.579 "base_bdevs_list": [ 00:17:52.579 { 00:17:52.579 "name": "BaseBdev1", 00:17:52.579 
"uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:52.579 "is_configured": true, 00:17:52.579 "data_offset": 0, 00:17:52.579 "data_size": 65536 00:17:52.579 }, 00:17:52.579 { 00:17:52.579 "name": null, 00:17:52.579 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:52.579 "is_configured": false, 00:17:52.579 "data_offset": 0, 00:17:52.579 "data_size": 65536 00:17:52.579 }, 00:17:52.579 { 00:17:52.579 "name": "BaseBdev3", 00:17:52.579 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:52.579 "is_configured": true, 00:17:52.579 "data_offset": 0, 00:17:52.579 "data_size": 65536 00:17:52.579 } 00:17:52.579 ] 00:17:52.579 }' 00:17:52.579 17:11:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.579 17:11:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.148 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:53.148 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.148 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:53.148 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:53.407 [2024-07-23 17:11:48.732122] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.407 17:11:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.666 17:11:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.666 "name": "Existed_Raid", 00:17:53.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.666 "strip_size_kb": 64, 00:17:53.666 "state": "configuring", 00:17:53.666 "raid_level": "raid0", 00:17:53.666 "superblock": false, 00:17:53.666 "num_base_bdevs": 3, 00:17:53.666 "num_base_bdevs_discovered": 1, 00:17:53.666 "num_base_bdevs_operational": 3, 00:17:53.666 "base_bdevs_list": [ 00:17:53.666 { 00:17:53.666 "name": "BaseBdev1", 00:17:53.666 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:53.666 "is_configured": true, 00:17:53.666 "data_offset": 0, 00:17:53.666 "data_size": 65536 00:17:53.666 }, 00:17:53.666 { 00:17:53.666 "name": null, 00:17:53.666 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:53.666 "is_configured": false, 00:17:53.666 
"data_offset": 0, 00:17:53.666 "data_size": 65536 00:17:53.666 }, 00:17:53.667 { 00:17:53.667 "name": null, 00:17:53.667 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:53.667 "is_configured": false, 00:17:53.667 "data_offset": 0, 00:17:53.667 "data_size": 65536 00:17:53.667 } 00:17:53.667 ] 00:17:53.667 }' 00:17:53.667 17:11:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.667 17:11:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.235 17:11:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.235 17:11:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:54.494 17:11:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:54.494 17:11:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:54.753 [2024-07-23 17:11:50.079734] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:54.753 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:54.753 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.754 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.012 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.012 "name": "Existed_Raid", 00:17:55.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.012 "strip_size_kb": 64, 00:17:55.012 "state": "configuring", 00:17:55.012 "raid_level": "raid0", 00:17:55.012 "superblock": false, 00:17:55.012 "num_base_bdevs": 3, 00:17:55.012 "num_base_bdevs_discovered": 2, 00:17:55.012 "num_base_bdevs_operational": 3, 00:17:55.012 "base_bdevs_list": [ 00:17:55.012 { 00:17:55.012 "name": "BaseBdev1", 00:17:55.012 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:55.012 "is_configured": true, 00:17:55.012 "data_offset": 0, 00:17:55.012 "data_size": 65536 00:17:55.012 }, 00:17:55.012 { 00:17:55.012 "name": null, 00:17:55.012 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:55.012 "is_configured": false, 00:17:55.012 "data_offset": 0, 00:17:55.012 "data_size": 65536 00:17:55.013 }, 00:17:55.013 { 00:17:55.013 "name": "BaseBdev3", 00:17:55.013 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:55.013 "is_configured": true, 00:17:55.013 "data_offset": 0, 00:17:55.013 "data_size": 65536 00:17:55.013 } 00:17:55.013 ] 
00:17:55.013 }' 00:17:55.013 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.013 17:11:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.581 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.581 17:11:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:55.841 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:55.841 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:56.100 [2024-07-23 17:11:51.423301] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.100 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.359 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.359 "name": "Existed_Raid", 00:17:56.359 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.359 "strip_size_kb": 64, 00:17:56.359 "state": "configuring", 00:17:56.359 "raid_level": "raid0", 00:17:56.359 "superblock": false, 00:17:56.359 "num_base_bdevs": 3, 00:17:56.359 "num_base_bdevs_discovered": 1, 00:17:56.359 "num_base_bdevs_operational": 3, 00:17:56.359 "base_bdevs_list": [ 00:17:56.359 { 00:17:56.359 "name": null, 00:17:56.359 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:56.359 "is_configured": false, 00:17:56.359 "data_offset": 0, 00:17:56.359 "data_size": 65536 00:17:56.359 }, 00:17:56.359 { 00:17:56.359 "name": null, 00:17:56.359 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:56.359 "is_configured": false, 00:17:56.360 "data_offset": 0, 00:17:56.360 "data_size": 65536 00:17:56.360 }, 00:17:56.360 { 00:17:56.360 "name": "BaseBdev3", 00:17:56.360 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:56.360 "is_configured": true, 00:17:56.360 "data_offset": 0, 00:17:56.360 "data_size": 65536 00:17:56.360 } 00:17:56.360 ] 00:17:56.360 }' 00:17:56.360 17:11:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.360 17:11:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.929 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.929 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:57.189 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:57.189 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:57.449 [2024-07-23 17:11:52.675024] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.449 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.708 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.708 "name": "Existed_Raid", 00:17:57.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.708 "strip_size_kb": 64, 00:17:57.708 "state": "configuring", 00:17:57.708 "raid_level": "raid0", 00:17:57.708 "superblock": false, 00:17:57.708 "num_base_bdevs": 3, 00:17:57.708 "num_base_bdevs_discovered": 2, 00:17:57.708 "num_base_bdevs_operational": 3, 00:17:57.708 "base_bdevs_list": [ 00:17:57.708 { 00:17:57.708 "name": null, 00:17:57.708 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:57.708 "is_configured": false, 00:17:57.708 "data_offset": 0, 00:17:57.708 "data_size": 65536 00:17:57.708 }, 00:17:57.708 { 00:17:57.708 "name": "BaseBdev2", 00:17:57.708 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:57.708 "is_configured": true, 00:17:57.708 "data_offset": 0, 00:17:57.708 "data_size": 65536 00:17:57.708 }, 00:17:57.708 { 00:17:57.708 "name": "BaseBdev3", 00:17:57.708 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:57.708 "is_configured": true, 00:17:57.708 "data_offset": 0, 00:17:57.708 "data_size": 65536 00:17:57.708 } 00:17:57.708 ] 00:17:57.708 }' 00:17:57.708 17:11:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.708 17:11:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.276 17:11:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.276 17:11:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:58.534 
17:11:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:58.534 17:11:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:58.534 17:11:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.794 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c4ab5f9a-8a91-43b6-9470-5560514557d9 00:17:59.054 [2024-07-23 17:11:54.267720] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:59.054 [2024-07-23 17:11:54.267754] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x100f900 00:17:59.054 [2024-07-23 17:11:54.267762] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:17:59.054 [2024-07-23 17:11:54.267959] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1017ab0 00:17:59.054 [2024-07-23 17:11:54.268072] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x100f900 00:17:59.054 [2024-07-23 17:11:54.268082] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x100f900 00:17:59.054 [2024-07-23 17:11:54.268237] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:59.054 NewBaseBdev 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:59.054 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:59.313 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:59.572 [ 00:17:59.572 { 00:17:59.572 "name": "NewBaseBdev", 00:17:59.572 "aliases": [ 00:17:59.572 "c4ab5f9a-8a91-43b6-9470-5560514557d9" 00:17:59.572 ], 00:17:59.572 "product_name": "Malloc disk", 00:17:59.572 "block_size": 512, 00:17:59.572 "num_blocks": 65536, 00:17:59.573 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:59.573 "assigned_rate_limits": { 00:17:59.573 "rw_ios_per_sec": 0, 00:17:59.573 "rw_mbytes_per_sec": 0, 00:17:59.573 "r_mbytes_per_sec": 0, 00:17:59.573 "w_mbytes_per_sec": 0 00:17:59.573 }, 00:17:59.573 "claimed": true, 00:17:59.573 "claim_type": "exclusive_write", 00:17:59.573 "zoned": false, 00:17:59.573 "supported_io_types": { 00:17:59.573 "read": true, 00:17:59.573 "write": true, 00:17:59.573 "unmap": true, 00:17:59.573 "flush": true, 00:17:59.573 "reset": true, 00:17:59.573 "nvme_admin": false, 00:17:59.573 "nvme_io": false, 00:17:59.573 "nvme_io_md": false, 00:17:59.573 "write_zeroes": true, 00:17:59.573 "zcopy": true, 00:17:59.573 "get_zone_info": false, 00:17:59.573 "zone_management": false, 00:17:59.573 "zone_append": false, 00:17:59.573 "compare": false, 00:17:59.573 "compare_and_write": false, 00:17:59.573 "abort": true, 00:17:59.573 "seek_hole": false, 00:17:59.573 "seek_data": false, 00:17:59.573 "copy": true, 00:17:59.573 "nvme_iov_md": 
false 00:17:59.573 }, 00:17:59.573 "memory_domains": [ 00:17:59.573 { 00:17:59.573 "dma_device_id": "system", 00:17:59.573 "dma_device_type": 1 00:17:59.573 }, 00:17:59.573 { 00:17:59.573 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.573 "dma_device_type": 2 00:17:59.573 } 00:17:59.573 ], 00:17:59.573 "driver_specific": {} 00:17:59.573 } 00:17:59.573 ] 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.573 17:11:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.832 17:11:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.833 "name": "Existed_Raid", 00:17:59.833 "uuid": "1e0ee5c7-c3f4-46d5-9783-99ff786e5ffb", 00:17:59.833 "strip_size_kb": 64, 00:17:59.833 "state": "online", 00:17:59.833 "raid_level": "raid0", 00:17:59.833 "superblock": false, 00:17:59.833 "num_base_bdevs": 3, 00:17:59.833 "num_base_bdevs_discovered": 3, 00:17:59.833 "num_base_bdevs_operational": 3, 00:17:59.833 "base_bdevs_list": [ 00:17:59.833 { 00:17:59.833 "name": "NewBaseBdev", 00:17:59.833 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:17:59.833 "is_configured": true, 00:17:59.833 "data_offset": 0, 00:17:59.833 "data_size": 65536 00:17:59.833 }, 00:17:59.833 { 00:17:59.833 "name": "BaseBdev2", 00:17:59.833 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:17:59.833 "is_configured": true, 00:17:59.833 "data_offset": 0, 00:17:59.833 "data_size": 65536 00:17:59.833 }, 00:17:59.833 { 00:17:59.833 "name": "BaseBdev3", 00:17:59.833 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:17:59.833 "is_configured": true, 00:17:59.833 "data_offset": 0, 00:17:59.833 "data_size": 65536 00:17:59.833 } 00:17:59.833 ] 00:17:59.833 }' 00:17:59.833 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.833 17:11:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:00.403 17:11:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:00.403 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:00.662 [2024-07-23 17:11:55.876278] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:00.663 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:00.663 "name": "Existed_Raid", 00:18:00.663 "aliases": [ 00:18:00.663 "1e0ee5c7-c3f4-46d5-9783-99ff786e5ffb" 00:18:00.663 ], 00:18:00.663 "product_name": "Raid Volume", 00:18:00.663 "block_size": 512, 00:18:00.663 "num_blocks": 196608, 00:18:00.663 "uuid": "1e0ee5c7-c3f4-46d5-9783-99ff786e5ffb", 00:18:00.663 "assigned_rate_limits": { 00:18:00.663 "rw_ios_per_sec": 0, 00:18:00.663 "rw_mbytes_per_sec": 0, 00:18:00.663 "r_mbytes_per_sec": 0, 00:18:00.663 "w_mbytes_per_sec": 0 00:18:00.663 }, 00:18:00.663 "claimed": false, 00:18:00.663 "zoned": false, 00:18:00.663 "supported_io_types": { 00:18:00.663 "read": true, 00:18:00.663 "write": true, 00:18:00.663 "unmap": true, 00:18:00.663 "flush": true, 00:18:00.663 "reset": true, 00:18:00.663 "nvme_admin": false, 00:18:00.663 "nvme_io": false, 00:18:00.663 "nvme_io_md": false, 00:18:00.663 "write_zeroes": true, 00:18:00.663 "zcopy": false, 00:18:00.663 "get_zone_info": false, 00:18:00.663 "zone_management": false, 00:18:00.663 "zone_append": false, 00:18:00.663 "compare": false, 00:18:00.663 "compare_and_write": false, 00:18:00.663 "abort": false, 00:18:00.663 "seek_hole": false, 00:18:00.663 "seek_data": false, 00:18:00.663 "copy": false, 00:18:00.663 "nvme_iov_md": false 00:18:00.663 }, 00:18:00.663 "memory_domains": [ 00:18:00.663 { 00:18:00.663 "dma_device_id": "system", 00:18:00.663 "dma_device_type": 1 00:18:00.663 }, 
00:18:00.663 { 00:18:00.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.663 "dma_device_type": 2 00:18:00.663 }, 00:18:00.663 { 00:18:00.663 "dma_device_id": "system", 00:18:00.663 "dma_device_type": 1 00:18:00.663 }, 00:18:00.663 { 00:18:00.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.663 "dma_device_type": 2 00:18:00.663 }, 00:18:00.663 { 00:18:00.663 "dma_device_id": "system", 00:18:00.663 "dma_device_type": 1 00:18:00.663 }, 00:18:00.663 { 00:18:00.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.663 "dma_device_type": 2 00:18:00.663 } 00:18:00.663 ], 00:18:00.663 "driver_specific": { 00:18:00.663 "raid": { 00:18:00.663 "uuid": "1e0ee5c7-c3f4-46d5-9783-99ff786e5ffb", 00:18:00.663 "strip_size_kb": 64, 00:18:00.663 "state": "online", 00:18:00.663 "raid_level": "raid0", 00:18:00.663 "superblock": false, 00:18:00.663 "num_base_bdevs": 3, 00:18:00.663 "num_base_bdevs_discovered": 3, 00:18:00.663 "num_base_bdevs_operational": 3, 00:18:00.663 "base_bdevs_list": [ 00:18:00.663 { 00:18:00.663 "name": "NewBaseBdev", 00:18:00.663 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:18:00.663 "is_configured": true, 00:18:00.663 "data_offset": 0, 00:18:00.663 "data_size": 65536 00:18:00.663 }, 00:18:00.663 { 00:18:00.663 "name": "BaseBdev2", 00:18:00.663 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:18:00.663 "is_configured": true, 00:18:00.663 "data_offset": 0, 00:18:00.663 "data_size": 65536 00:18:00.663 }, 00:18:00.663 { 00:18:00.663 "name": "BaseBdev3", 00:18:00.663 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:18:00.663 "is_configured": true, 00:18:00.663 "data_offset": 0, 00:18:00.663 "data_size": 65536 00:18:00.663 } 00:18:00.663 ] 00:18:00.663 } 00:18:00.663 } 00:18:00.663 }' 00:18:00.663 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:00.663 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:18:00.663 BaseBdev2 00:18:00.663 BaseBdev3' 00:18:00.663 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.663 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:00.663 17:11:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:00.921 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:00.921 "name": "NewBaseBdev", 00:18:00.921 "aliases": [ 00:18:00.921 "c4ab5f9a-8a91-43b6-9470-5560514557d9" 00:18:00.921 ], 00:18:00.921 "product_name": "Malloc disk", 00:18:00.921 "block_size": 512, 00:18:00.921 "num_blocks": 65536, 00:18:00.921 "uuid": "c4ab5f9a-8a91-43b6-9470-5560514557d9", 00:18:00.921 "assigned_rate_limits": { 00:18:00.921 "rw_ios_per_sec": 0, 00:18:00.921 "rw_mbytes_per_sec": 0, 00:18:00.921 "r_mbytes_per_sec": 0, 00:18:00.921 "w_mbytes_per_sec": 0 00:18:00.921 }, 00:18:00.921 "claimed": true, 00:18:00.921 "claim_type": "exclusive_write", 00:18:00.921 "zoned": false, 00:18:00.921 "supported_io_types": { 00:18:00.921 "read": true, 00:18:00.921 "write": true, 00:18:00.921 "unmap": true, 00:18:00.921 "flush": true, 00:18:00.921 "reset": true, 00:18:00.921 "nvme_admin": false, 00:18:00.921 "nvme_io": false, 00:18:00.921 "nvme_io_md": false, 00:18:00.921 "write_zeroes": true, 00:18:00.921 "zcopy": true, 00:18:00.921 "get_zone_info": false, 00:18:00.921 "zone_management": false, 00:18:00.921 "zone_append": false, 00:18:00.921 "compare": false, 00:18:00.921 "compare_and_write": false, 00:18:00.921 "abort": true, 00:18:00.921 "seek_hole": false, 00:18:00.921 "seek_data": false, 00:18:00.921 "copy": true, 00:18:00.921 "nvme_iov_md": false 00:18:00.921 }, 00:18:00.921 "memory_domains": [ 00:18:00.921 { 00:18:00.921 "dma_device_id": "system", 00:18:00.921 
"dma_device_type": 1 00:18:00.921 }, 00:18:00.921 { 00:18:00.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.921 "dma_device_type": 2 00:18:00.921 } 00:18:00.921 ], 00:18:00.922 "driver_specific": {} 00:18:00.922 }' 00:18:00.922 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.922 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.922 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.922 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.180 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.180 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.180 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.180 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.180 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.180 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.439 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.439 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.439 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.439 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:01.439 17:11:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.007 "name": 
"BaseBdev2", 00:18:02.007 "aliases": [ 00:18:02.007 "0b0c6dbb-fe5e-4b75-889d-4a1667e04853" 00:18:02.007 ], 00:18:02.007 "product_name": "Malloc disk", 00:18:02.007 "block_size": 512, 00:18:02.007 "num_blocks": 65536, 00:18:02.007 "uuid": "0b0c6dbb-fe5e-4b75-889d-4a1667e04853", 00:18:02.007 "assigned_rate_limits": { 00:18:02.007 "rw_ios_per_sec": 0, 00:18:02.007 "rw_mbytes_per_sec": 0, 00:18:02.007 "r_mbytes_per_sec": 0, 00:18:02.007 "w_mbytes_per_sec": 0 00:18:02.007 }, 00:18:02.007 "claimed": true, 00:18:02.007 "claim_type": "exclusive_write", 00:18:02.007 "zoned": false, 00:18:02.007 "supported_io_types": { 00:18:02.007 "read": true, 00:18:02.007 "write": true, 00:18:02.007 "unmap": true, 00:18:02.007 "flush": true, 00:18:02.007 "reset": true, 00:18:02.007 "nvme_admin": false, 00:18:02.007 "nvme_io": false, 00:18:02.007 "nvme_io_md": false, 00:18:02.007 "write_zeroes": true, 00:18:02.007 "zcopy": true, 00:18:02.007 "get_zone_info": false, 00:18:02.007 "zone_management": false, 00:18:02.007 "zone_append": false, 00:18:02.007 "compare": false, 00:18:02.007 "compare_and_write": false, 00:18:02.007 "abort": true, 00:18:02.007 "seek_hole": false, 00:18:02.007 "seek_data": false, 00:18:02.007 "copy": true, 00:18:02.007 "nvme_iov_md": false 00:18:02.007 }, 00:18:02.007 "memory_domains": [ 00:18:02.007 { 00:18:02.007 "dma_device_id": "system", 00:18:02.007 "dma_device_type": 1 00:18:02.007 }, 00:18:02.007 { 00:18:02.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.007 "dma_device_type": 2 00:18:02.007 } 00:18:02.007 ], 00:18:02.007 "driver_specific": {} 00:18:02.007 }' 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.007 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.266 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.266 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.266 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.266 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.266 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.266 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.267 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:02.267 17:11:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.836 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.836 "name": "BaseBdev3", 00:18:02.836 "aliases": [ 00:18:02.836 "10a6c59a-6b7f-4a6d-8dad-3f7966710d02" 00:18:02.836 ], 00:18:02.836 "product_name": "Malloc disk", 00:18:02.836 "block_size": 512, 00:18:02.836 "num_blocks": 65536, 00:18:02.836 "uuid": "10a6c59a-6b7f-4a6d-8dad-3f7966710d02", 00:18:02.836 "assigned_rate_limits": { 00:18:02.836 "rw_ios_per_sec": 0, 00:18:02.836 "rw_mbytes_per_sec": 0, 00:18:02.836 "r_mbytes_per_sec": 0, 00:18:02.836 "w_mbytes_per_sec": 0 00:18:02.836 }, 00:18:02.836 "claimed": true, 00:18:02.836 "claim_type": "exclusive_write", 00:18:02.836 "zoned": false, 00:18:02.836 "supported_io_types": { 
00:18:02.836 "read": true, 00:18:02.836 "write": true, 00:18:02.836 "unmap": true, 00:18:02.836 "flush": true, 00:18:02.836 "reset": true, 00:18:02.836 "nvme_admin": false, 00:18:02.836 "nvme_io": false, 00:18:02.836 "nvme_io_md": false, 00:18:02.836 "write_zeroes": true, 00:18:02.836 "zcopy": true, 00:18:02.836 "get_zone_info": false, 00:18:02.836 "zone_management": false, 00:18:02.836 "zone_append": false, 00:18:02.836 "compare": false, 00:18:02.836 "compare_and_write": false, 00:18:02.836 "abort": true, 00:18:02.836 "seek_hole": false, 00:18:02.836 "seek_data": false, 00:18:02.836 "copy": true, 00:18:02.836 "nvme_iov_md": false 00:18:02.836 }, 00:18:02.836 "memory_domains": [ 00:18:02.836 { 00:18:02.836 "dma_device_id": "system", 00:18:02.836 "dma_device_type": 1 00:18:02.836 }, 00:18:02.836 { 00:18:02.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.836 "dma_device_type": 2 00:18:02.836 } 00:18:02.836 ], 00:18:02.836 "driver_specific": {} 00:18:02.836 }' 00:18:02.836 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.836 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:03.095 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.354 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.354 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:03.354 [2024-07-23 17:11:58.755660] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:03.354 [2024-07-23 17:11:58.755686] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:03.354 [2024-07-23 17:11:58.755735] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:03.354 [2024-07-23 17:11:58.755783] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:03.354 [2024-07-23 17:11:58.755794] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x100f900 name Existed_Raid, state offline 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4128913 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4128913 ']' 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4128913 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4128913 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4128913' 00:18:03.614 killing process with pid 4128913 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4128913 00:18:03.614 [2024-07-23 17:11:58.840939] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:03.614 17:11:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4128913 00:18:03.614 [2024-07-23 17:11:58.868480] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:03.874 00:18:03.874 real 0m30.572s 00:18:03.874 user 0m56.495s 00:18:03.874 sys 0m5.508s 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.874 ************************************ 00:18:03.874 END TEST raid_state_function_test 00:18:03.874 ************************************ 00:18:03.874 17:11:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:03.874 17:11:59 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:18:03.874 17:11:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:03.874 17:11:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:03.874 17:11:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:03.874 ************************************ 00:18:03.874 START TEST raid_state_function_test_sb 00:18:03.874 ************************************ 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4133389 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4133389' 00:18:03.874 Process raid pid: 4133389 00:18:03.874 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4133389 /var/tmp/spdk-raid.sock 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4133389 ']' 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:18:03.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:03.875 17:11:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:03.875 [2024-07-23 17:11:59.236864] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:18:03.875 [2024-07-23 17:11:59.236960] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:04.135 [2024-07-23 17:11:59.372930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:04.135 [2024-07-23 17:11:59.428150] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:04.135 [2024-07-23 17:11:59.488819] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:04.135 [2024-07-23 17:11:59.488851] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:05.073 [2024-07-23 17:12:00.355620] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:05.073 [2024-07-23 17:12:00.355666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:05.073 [2024-07-23 17:12:00.355677] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:05.073 [2024-07-23 17:12:00.355689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:05.073 [2024-07-23 17:12:00.355698] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:05.073 [2024-07-23 17:12:00.355709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.073 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:05.332 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.332 "name": "Existed_Raid", 00:18:05.332 "uuid": "be6600fd-56fc-44e7-a99e-8c32dd46961f", 00:18:05.332 "strip_size_kb": 64, 00:18:05.332 "state": "configuring", 00:18:05.332 "raid_level": "raid0", 00:18:05.332 "superblock": true, 00:18:05.332 "num_base_bdevs": 3, 00:18:05.332 "num_base_bdevs_discovered": 0, 00:18:05.332 "num_base_bdevs_operational": 3, 00:18:05.332 "base_bdevs_list": [ 00:18:05.332 { 00:18:05.332 "name": "BaseBdev1", 00:18:05.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.332 "is_configured": false, 00:18:05.332 "data_offset": 0, 00:18:05.332 "data_size": 0 00:18:05.332 }, 00:18:05.332 { 00:18:05.332 "name": "BaseBdev2", 00:18:05.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.333 "is_configured": false, 00:18:05.333 "data_offset": 0, 00:18:05.333 "data_size": 0 00:18:05.333 }, 00:18:05.333 { 00:18:05.333 "name": "BaseBdev3", 00:18:05.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.333 "is_configured": false, 00:18:05.333 "data_offset": 0, 00:18:05.333 "data_size": 0 00:18:05.333 } 00:18:05.333 ] 00:18:05.333 }' 00:18:05.333 17:12:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.333 17:12:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.268 17:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:06.526 [2024-07-23 17:12:01.763188] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:06.526 [2024-07-23 17:12:01.763221] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1be3280 name Existed_Raid, state configuring 00:18:06.526 17:12:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:06.792 [2024-07-23 17:12:02.023900] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:06.792 [2024-07-23 17:12:02.023930] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:06.792 [2024-07-23 17:12:02.023941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:06.792 [2024-07-23 17:12:02.023952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:06.792 [2024-07-23 17:12:02.023961] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:06.792 [2024-07-23 17:12:02.023972] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:06.792 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:07.117 [2024-07-23 17:12:02.294398] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:07.117 BaseBdev1 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- 
# bdev_timeout=2000 00:18:07.117 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.375 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:07.375 [ 00:18:07.375 { 00:18:07.375 "name": "BaseBdev1", 00:18:07.375 "aliases": [ 00:18:07.375 "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f" 00:18:07.375 ], 00:18:07.375 "product_name": "Malloc disk", 00:18:07.375 "block_size": 512, 00:18:07.375 "num_blocks": 65536, 00:18:07.375 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:07.375 "assigned_rate_limits": { 00:18:07.375 "rw_ios_per_sec": 0, 00:18:07.375 "rw_mbytes_per_sec": 0, 00:18:07.375 "r_mbytes_per_sec": 0, 00:18:07.375 "w_mbytes_per_sec": 0 00:18:07.375 }, 00:18:07.375 "claimed": true, 00:18:07.375 "claim_type": "exclusive_write", 00:18:07.375 "zoned": false, 00:18:07.375 "supported_io_types": { 00:18:07.375 "read": true, 00:18:07.375 "write": true, 00:18:07.375 "unmap": true, 00:18:07.375 "flush": true, 00:18:07.375 "reset": true, 00:18:07.375 "nvme_admin": false, 00:18:07.375 "nvme_io": false, 00:18:07.375 "nvme_io_md": false, 00:18:07.375 "write_zeroes": true, 00:18:07.375 "zcopy": true, 00:18:07.375 "get_zone_info": false, 00:18:07.375 "zone_management": false, 00:18:07.375 "zone_append": false, 00:18:07.375 "compare": false, 00:18:07.375 "compare_and_write": false, 00:18:07.375 "abort": true, 00:18:07.375 "seek_hole": false, 00:18:07.375 "seek_data": false, 00:18:07.375 "copy": true, 00:18:07.375 "nvme_iov_md": false 00:18:07.375 }, 00:18:07.375 "memory_domains": [ 00:18:07.375 { 00:18:07.375 "dma_device_id": "system", 00:18:07.375 "dma_device_type": 1 00:18:07.375 }, 00:18:07.375 { 00:18:07.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.375 
"dma_device_type": 2 00:18:07.375 } 00:18:07.375 ], 00:18:07.375 "driver_specific": {} 00:18:07.375 } 00:18:07.375 ] 00:18:07.633 17:12:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:07.633 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:07.633 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.634 17:12:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:07.892 17:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.892 "name": "Existed_Raid", 00:18:07.892 "uuid": "41849144-f635-4a95-a8ca-26fb3d205a76", 00:18:07.892 "strip_size_kb": 64, 
00:18:07.892 "state": "configuring", 00:18:07.892 "raid_level": "raid0", 00:18:07.892 "superblock": true, 00:18:07.892 "num_base_bdevs": 3, 00:18:07.892 "num_base_bdevs_discovered": 1, 00:18:07.892 "num_base_bdevs_operational": 3, 00:18:07.892 "base_bdevs_list": [ 00:18:07.892 { 00:18:07.892 "name": "BaseBdev1", 00:18:07.892 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:07.892 "is_configured": true, 00:18:07.892 "data_offset": 2048, 00:18:07.892 "data_size": 63488 00:18:07.892 }, 00:18:07.892 { 00:18:07.892 "name": "BaseBdev2", 00:18:07.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.892 "is_configured": false, 00:18:07.892 "data_offset": 0, 00:18:07.892 "data_size": 0 00:18:07.892 }, 00:18:07.892 { 00:18:07.892 "name": "BaseBdev3", 00:18:07.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.892 "is_configured": false, 00:18:07.892 "data_offset": 0, 00:18:07.892 "data_size": 0 00:18:07.892 } 00:18:07.892 ] 00:18:07.892 }' 00:18:07.892 17:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.892 17:12:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.458 17:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:08.458 [2024-07-23 17:12:03.878591] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:08.458 [2024-07-23 17:12:03.878632] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1be2bb0 name Existed_Raid, state configuring 00:18:08.716 17:12:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:08.975 [2024-07-23 17:12:04.379971] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:08.975 [2024-07-23 17:12:04.381465] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:08.975 [2024-07-23 17:12:04.381500] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:08.975 [2024-07-23 17:12:04.381511] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:08.975 [2024-07-23 17:12:04.381522] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.234 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.494 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.494 "name": "Existed_Raid", 00:18:09.494 "uuid": "e7ea646e-bac6-4708-bec7-42062c598c81", 00:18:09.494 "strip_size_kb": 64, 00:18:09.494 "state": "configuring", 00:18:09.494 "raid_level": "raid0", 00:18:09.494 "superblock": true, 00:18:09.494 "num_base_bdevs": 3, 00:18:09.494 "num_base_bdevs_discovered": 1, 00:18:09.494 "num_base_bdevs_operational": 3, 00:18:09.494 "base_bdevs_list": [ 00:18:09.494 { 00:18:09.494 "name": "BaseBdev1", 00:18:09.494 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:09.494 "is_configured": true, 00:18:09.494 "data_offset": 2048, 00:18:09.494 "data_size": 63488 00:18:09.494 }, 00:18:09.494 { 00:18:09.494 "name": "BaseBdev2", 00:18:09.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.494 "is_configured": false, 00:18:09.494 "data_offset": 0, 00:18:09.494 "data_size": 0 00:18:09.494 }, 00:18:09.494 { 00:18:09.494 "name": "BaseBdev3", 00:18:09.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.494 "is_configured": false, 00:18:09.494 "data_offset": 0, 00:18:09.494 "data_size": 0 00:18:09.494 } 00:18:09.494 ] 00:18:09.494 }' 00:18:09.494 17:12:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.494 17:12:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.062 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:10.062 
[2024-07-23 17:12:05.466188] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:10.062 BaseBdev2 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:10.321 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:10.580 [ 00:18:10.580 { 00:18:10.580 "name": "BaseBdev2", 00:18:10.580 "aliases": [ 00:18:10.580 "4029d350-6123-47e0-95af-5c6b88c017d2" 00:18:10.580 ], 00:18:10.580 "product_name": "Malloc disk", 00:18:10.580 "block_size": 512, 00:18:10.580 "num_blocks": 65536, 00:18:10.580 "uuid": "4029d350-6123-47e0-95af-5c6b88c017d2", 00:18:10.580 "assigned_rate_limits": { 00:18:10.580 "rw_ios_per_sec": 0, 00:18:10.580 "rw_mbytes_per_sec": 0, 00:18:10.580 "r_mbytes_per_sec": 0, 00:18:10.580 "w_mbytes_per_sec": 0 00:18:10.580 }, 00:18:10.580 "claimed": true, 00:18:10.580 "claim_type": "exclusive_write", 00:18:10.580 "zoned": false, 00:18:10.580 "supported_io_types": { 00:18:10.580 "read": true, 00:18:10.580 "write": true, 00:18:10.580 "unmap": 
true, 00:18:10.580 "flush": true, 00:18:10.580 "reset": true, 00:18:10.580 "nvme_admin": false, 00:18:10.580 "nvme_io": false, 00:18:10.580 "nvme_io_md": false, 00:18:10.580 "write_zeroes": true, 00:18:10.580 "zcopy": true, 00:18:10.580 "get_zone_info": false, 00:18:10.580 "zone_management": false, 00:18:10.580 "zone_append": false, 00:18:10.580 "compare": false, 00:18:10.580 "compare_and_write": false, 00:18:10.580 "abort": true, 00:18:10.580 "seek_hole": false, 00:18:10.580 "seek_data": false, 00:18:10.580 "copy": true, 00:18:10.580 "nvme_iov_md": false 00:18:10.580 }, 00:18:10.580 "memory_domains": [ 00:18:10.580 { 00:18:10.580 "dma_device_id": "system", 00:18:10.580 "dma_device_type": 1 00:18:10.580 }, 00:18:10.580 { 00:18:10.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.580 "dma_device_type": 2 00:18:10.580 } 00:18:10.580 ], 00:18:10.580 "driver_specific": {} 00:18:10.580 } 00:18:10.580 ] 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:10.580 
17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.580 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.581 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.581 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.581 17:12:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.840 17:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.840 "name": "Existed_Raid", 00:18:10.840 "uuid": "e7ea646e-bac6-4708-bec7-42062c598c81", 00:18:10.840 "strip_size_kb": 64, 00:18:10.840 "state": "configuring", 00:18:10.840 "raid_level": "raid0", 00:18:10.840 "superblock": true, 00:18:10.840 "num_base_bdevs": 3, 00:18:10.840 "num_base_bdevs_discovered": 2, 00:18:10.840 "num_base_bdevs_operational": 3, 00:18:10.840 "base_bdevs_list": [ 00:18:10.840 { 00:18:10.840 "name": "BaseBdev1", 00:18:10.840 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:10.840 "is_configured": true, 00:18:10.840 "data_offset": 2048, 00:18:10.840 "data_size": 63488 00:18:10.840 }, 00:18:10.840 { 00:18:10.840 "name": "BaseBdev2", 00:18:10.840 "uuid": "4029d350-6123-47e0-95af-5c6b88c017d2", 00:18:10.840 "is_configured": true, 00:18:10.840 "data_offset": 2048, 00:18:10.840 "data_size": 63488 00:18:10.840 }, 00:18:10.840 { 00:18:10.840 "name": "BaseBdev3", 00:18:10.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.840 "is_configured": false, 00:18:10.840 "data_offset": 0, 00:18:10.840 "data_size": 0 00:18:10.840 } 00:18:10.840 ] 00:18:10.840 }' 00:18:10.840 
17:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.840 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.408 17:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:11.668 [2024-07-23 17:12:06.938650] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:11.668 [2024-07-23 17:12:06.938819] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1be2800 00:18:11.668 [2024-07-23 17:12:06.938833] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:11.668 [2024-07-23 17:12:06.939028] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1be6b50 00:18:11.668 [2024-07-23 17:12:06.939152] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1be2800 00:18:11.668 [2024-07-23 17:12:06.939162] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1be2800 00:18:11.668 [2024-07-23 17:12:06.939254] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:11.668 BaseBdev3 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:11.668 17:12:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:11.927 17:12:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:12.496 [ 00:18:12.496 { 00:18:12.496 "name": "BaseBdev3", 00:18:12.496 "aliases": [ 00:18:12.496 "354c4de5-698b-422f-a76e-6ed40a8e993f" 00:18:12.496 ], 00:18:12.496 "product_name": "Malloc disk", 00:18:12.496 "block_size": 512, 00:18:12.496 "num_blocks": 65536, 00:18:12.496 "uuid": "354c4de5-698b-422f-a76e-6ed40a8e993f", 00:18:12.496 "assigned_rate_limits": { 00:18:12.496 "rw_ios_per_sec": 0, 00:18:12.496 "rw_mbytes_per_sec": 0, 00:18:12.496 "r_mbytes_per_sec": 0, 00:18:12.496 "w_mbytes_per_sec": 0 00:18:12.496 }, 00:18:12.496 "claimed": true, 00:18:12.496 "claim_type": "exclusive_write", 00:18:12.496 "zoned": false, 00:18:12.496 "supported_io_types": { 00:18:12.496 "read": true, 00:18:12.496 "write": true, 00:18:12.496 "unmap": true, 00:18:12.496 "flush": true, 00:18:12.496 "reset": true, 00:18:12.496 "nvme_admin": false, 00:18:12.496 "nvme_io": false, 00:18:12.496 "nvme_io_md": false, 00:18:12.496 "write_zeroes": true, 00:18:12.496 "zcopy": true, 00:18:12.496 "get_zone_info": false, 00:18:12.496 "zone_management": false, 00:18:12.496 "zone_append": false, 00:18:12.496 "compare": false, 00:18:12.496 "compare_and_write": false, 00:18:12.496 "abort": true, 00:18:12.496 "seek_hole": false, 00:18:12.496 "seek_data": false, 00:18:12.496 "copy": true, 00:18:12.496 "nvme_iov_md": false 00:18:12.496 }, 00:18:12.496 "memory_domains": [ 00:18:12.496 { 00:18:12.496 "dma_device_id": "system", 00:18:12.496 "dma_device_type": 1 00:18:12.496 }, 00:18:12.496 { 00:18:12.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.496 
"dma_device_type": 2 00:18:12.496 } 00:18:12.496 ], 00:18:12.496 "driver_specific": {} 00:18:12.496 } 00:18:12.496 ] 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.496 17:12:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.755 17:12:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.755 "name": "Existed_Raid", 00:18:12.755 "uuid": "e7ea646e-bac6-4708-bec7-42062c598c81", 00:18:12.755 "strip_size_kb": 64, 00:18:12.755 "state": "online", 00:18:12.755 "raid_level": "raid0", 00:18:12.755 "superblock": true, 00:18:12.755 "num_base_bdevs": 3, 00:18:12.755 "num_base_bdevs_discovered": 3, 00:18:12.755 "num_base_bdevs_operational": 3, 00:18:12.755 "base_bdevs_list": [ 00:18:12.755 { 00:18:12.755 "name": "BaseBdev1", 00:18:12.755 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:12.755 "is_configured": true, 00:18:12.755 "data_offset": 2048, 00:18:12.755 "data_size": 63488 00:18:12.755 }, 00:18:12.755 { 00:18:12.755 "name": "BaseBdev2", 00:18:12.755 "uuid": "4029d350-6123-47e0-95af-5c6b88c017d2", 00:18:12.755 "is_configured": true, 00:18:12.755 "data_offset": 2048, 00:18:12.755 "data_size": 63488 00:18:12.755 }, 00:18:12.755 { 00:18:12.755 "name": "BaseBdev3", 00:18:12.755 "uuid": "354c4de5-698b-422f-a76e-6ed40a8e993f", 00:18:12.755 "is_configured": true, 00:18:12.755 "data_offset": 2048, 00:18:12.755 "data_size": 63488 00:18:12.755 } 00:18:12.755 ] 00:18:12.755 }' 00:18:12.755 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.755 17:12:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:13.323 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:13.583 [2024-07-23 17:12:08.835980] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:13.583 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:13.583 "name": "Existed_Raid", 00:18:13.583 "aliases": [ 00:18:13.583 "e7ea646e-bac6-4708-bec7-42062c598c81" 00:18:13.583 ], 00:18:13.583 "product_name": "Raid Volume", 00:18:13.583 "block_size": 512, 00:18:13.583 "num_blocks": 190464, 00:18:13.583 "uuid": "e7ea646e-bac6-4708-bec7-42062c598c81", 00:18:13.583 "assigned_rate_limits": { 00:18:13.583 "rw_ios_per_sec": 0, 00:18:13.583 "rw_mbytes_per_sec": 0, 00:18:13.583 "r_mbytes_per_sec": 0, 00:18:13.583 "w_mbytes_per_sec": 0 00:18:13.583 }, 00:18:13.583 "claimed": false, 00:18:13.583 "zoned": false, 00:18:13.583 "supported_io_types": { 00:18:13.583 "read": true, 00:18:13.583 "write": true, 00:18:13.583 "unmap": true, 00:18:13.583 "flush": true, 00:18:13.583 "reset": true, 00:18:13.583 "nvme_admin": false, 00:18:13.583 "nvme_io": false, 00:18:13.583 "nvme_io_md": false, 00:18:13.583 "write_zeroes": true, 00:18:13.583 "zcopy": false, 00:18:13.583 "get_zone_info": false, 00:18:13.583 "zone_management": false, 00:18:13.583 "zone_append": false, 00:18:13.583 "compare": false, 00:18:13.583 "compare_and_write": false, 00:18:13.583 "abort": false, 00:18:13.583 "seek_hole": false, 00:18:13.583 "seek_data": false, 00:18:13.583 "copy": false, 00:18:13.583 "nvme_iov_md": false 00:18:13.583 }, 00:18:13.583 "memory_domains": [ 00:18:13.583 { 00:18:13.583 "dma_device_id": "system", 00:18:13.583 
"dma_device_type": 1 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.583 "dma_device_type": 2 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "dma_device_id": "system", 00:18:13.583 "dma_device_type": 1 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.583 "dma_device_type": 2 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "dma_device_id": "system", 00:18:13.583 "dma_device_type": 1 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.583 "dma_device_type": 2 00:18:13.583 } 00:18:13.583 ], 00:18:13.583 "driver_specific": { 00:18:13.583 "raid": { 00:18:13.583 "uuid": "e7ea646e-bac6-4708-bec7-42062c598c81", 00:18:13.583 "strip_size_kb": 64, 00:18:13.583 "state": "online", 00:18:13.583 "raid_level": "raid0", 00:18:13.583 "superblock": true, 00:18:13.583 "num_base_bdevs": 3, 00:18:13.583 "num_base_bdevs_discovered": 3, 00:18:13.583 "num_base_bdevs_operational": 3, 00:18:13.583 "base_bdevs_list": [ 00:18:13.583 { 00:18:13.583 "name": "BaseBdev1", 00:18:13.583 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:13.583 "is_configured": true, 00:18:13.583 "data_offset": 2048, 00:18:13.583 "data_size": 63488 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "name": "BaseBdev2", 00:18:13.583 "uuid": "4029d350-6123-47e0-95af-5c6b88c017d2", 00:18:13.583 "is_configured": true, 00:18:13.583 "data_offset": 2048, 00:18:13.583 "data_size": 63488 00:18:13.583 }, 00:18:13.583 { 00:18:13.583 "name": "BaseBdev3", 00:18:13.583 "uuid": "354c4de5-698b-422f-a76e-6ed40a8e993f", 00:18:13.583 "is_configured": true, 00:18:13.583 "data_offset": 2048, 00:18:13.583 "data_size": 63488 00:18:13.583 } 00:18:13.583 ] 00:18:13.583 } 00:18:13.583 } 00:18:13.583 }' 00:18:13.583 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:13.583 17:12:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:13.583 BaseBdev2 00:18:13.583 BaseBdev3' 00:18:13.583 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.583 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:13.583 17:12:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:13.842 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:13.842 "name": "BaseBdev1", 00:18:13.842 "aliases": [ 00:18:13.842 "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f" 00:18:13.842 ], 00:18:13.842 "product_name": "Malloc disk", 00:18:13.842 "block_size": 512, 00:18:13.842 "num_blocks": 65536, 00:18:13.842 "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f", 00:18:13.842 "assigned_rate_limits": { 00:18:13.842 "rw_ios_per_sec": 0, 00:18:13.842 "rw_mbytes_per_sec": 0, 00:18:13.842 "r_mbytes_per_sec": 0, 00:18:13.842 "w_mbytes_per_sec": 0 00:18:13.842 }, 00:18:13.842 "claimed": true, 00:18:13.842 "claim_type": "exclusive_write", 00:18:13.842 "zoned": false, 00:18:13.842 "supported_io_types": { 00:18:13.842 "read": true, 00:18:13.842 "write": true, 00:18:13.842 "unmap": true, 00:18:13.842 "flush": true, 00:18:13.842 "reset": true, 00:18:13.842 "nvme_admin": false, 00:18:13.842 "nvme_io": false, 00:18:13.842 "nvme_io_md": false, 00:18:13.842 "write_zeroes": true, 00:18:13.842 "zcopy": true, 00:18:13.842 "get_zone_info": false, 00:18:13.842 "zone_management": false, 00:18:13.842 "zone_append": false, 00:18:13.842 "compare": false, 00:18:13.842 "compare_and_write": false, 00:18:13.842 "abort": true, 00:18:13.842 "seek_hole": false, 00:18:13.842 "seek_data": false, 00:18:13.842 "copy": true, 00:18:13.842 "nvme_iov_md": false 00:18:13.842 }, 00:18:13.842 "memory_domains": 
[ 00:18:13.842 { 00:18:13.842 "dma_device_id": "system", 00:18:13.842 "dma_device_type": 1 00:18:13.842 }, 00:18:13.842 { 00:18:13.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.842 "dma_device_type": 2 00:18:13.842 } 00:18:13.842 ], 00:18:13.842 "driver_specific": {} 00:18:13.842 }' 00:18:13.842 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.842 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:13.842 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:13.842 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.102 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.102 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:14.103 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
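[Editor's note] The `jq .block_size` / `.md_size` / `.md_interleave` / `.dif_type` probes traced above (bdev_raid.sh@205-208) each compare a field of the `bdev_get_bdevs` descriptor against an expected value, with a missing key surfacing as jq's `null`. A minimal self-contained Python sketch of that check, run against a trimmed copy of the BaseBdev1 descriptor printed above (the `probe` helper is hypothetical, not part of SPDK):

```python
import json

# Trimmed copy of the descriptor dumped by
# `rpc.py ... bdev_get_bdevs -b BaseBdev1` in the trace above;
# fields not needed for the checks are omitted.
base_bdev_info = json.loads("""
{
  "name": "BaseBdev1",
  "product_name": "Malloc disk",
  "block_size": 512,
  "num_blocks": 65536,
  "uuid": "53e04584-11cd-4c31-a8ec-3a4ec0d07a5f"
}
""")

def probe(info, key):
    """Stand-in for `jq .<key>`: a missing key maps to None,
    mirroring jq's null in the `[[ null == null ]]` comparisons."""
    return info.get(key)

# Same expectations the shell test asserts with [[ ... == ... ]]:
assert probe(base_bdev_info, "block_size") == 512
assert probe(base_bdev_info, "md_size") is None
assert probe(base_bdev_info, "md_interleave") is None
assert probe(base_bdev_info, "dif_type") is None
```

The shell script performs the same comparison per base bdev in the `for name in $base_bdev_names` loop; only the bdev name passed to `-b` changes.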
00:18:14.363 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:14.363 "name": "BaseBdev2", 00:18:14.363 "aliases": [ 00:18:14.363 "4029d350-6123-47e0-95af-5c6b88c017d2" 00:18:14.363 ], 00:18:14.363 "product_name": "Malloc disk", 00:18:14.363 "block_size": 512, 00:18:14.363 "num_blocks": 65536, 00:18:14.363 "uuid": "4029d350-6123-47e0-95af-5c6b88c017d2", 00:18:14.363 "assigned_rate_limits": { 00:18:14.363 "rw_ios_per_sec": 0, 00:18:14.363 "rw_mbytes_per_sec": 0, 00:18:14.363 "r_mbytes_per_sec": 0, 00:18:14.363 "w_mbytes_per_sec": 0 00:18:14.363 }, 00:18:14.363 "claimed": true, 00:18:14.363 "claim_type": "exclusive_write", 00:18:14.363 "zoned": false, 00:18:14.363 "supported_io_types": { 00:18:14.363 "read": true, 00:18:14.363 "write": true, 00:18:14.363 "unmap": true, 00:18:14.363 "flush": true, 00:18:14.363 "reset": true, 00:18:14.363 "nvme_admin": false, 00:18:14.363 "nvme_io": false, 00:18:14.363 "nvme_io_md": false, 00:18:14.363 "write_zeroes": true, 00:18:14.363 "zcopy": true, 00:18:14.363 "get_zone_info": false, 00:18:14.363 "zone_management": false, 00:18:14.363 "zone_append": false, 00:18:14.363 "compare": false, 00:18:14.363 "compare_and_write": false, 00:18:14.363 "abort": true, 00:18:14.363 "seek_hole": false, 00:18:14.363 "seek_data": false, 00:18:14.363 "copy": true, 00:18:14.363 "nvme_iov_md": false 00:18:14.363 }, 00:18:14.363 "memory_domains": [ 00:18:14.363 { 00:18:14.363 "dma_device_id": "system", 00:18:14.363 "dma_device_type": 1 00:18:14.363 }, 00:18:14.363 { 00:18:14.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.363 "dma_device_type": 2 00:18:14.363 } 00:18:14.363 ], 00:18:14.363 "driver_specific": {} 00:18:14.363 }' 00:18:14.363 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.622 17:12:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.622 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.622 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.881 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.881 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.881 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:14.881 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:14.881 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.140 "name": "BaseBdev3", 00:18:15.140 "aliases": [ 00:18:15.140 "354c4de5-698b-422f-a76e-6ed40a8e993f" 00:18:15.140 ], 00:18:15.140 "product_name": "Malloc disk", 00:18:15.140 "block_size": 512, 00:18:15.140 "num_blocks": 65536, 00:18:15.140 "uuid": "354c4de5-698b-422f-a76e-6ed40a8e993f", 00:18:15.140 "assigned_rate_limits": { 00:18:15.140 "rw_ios_per_sec": 0, 00:18:15.140 "rw_mbytes_per_sec": 0, 00:18:15.140 "r_mbytes_per_sec": 0, 00:18:15.140 
"w_mbytes_per_sec": 0 00:18:15.140 }, 00:18:15.140 "claimed": true, 00:18:15.140 "claim_type": "exclusive_write", 00:18:15.140 "zoned": false, 00:18:15.140 "supported_io_types": { 00:18:15.140 "read": true, 00:18:15.140 "write": true, 00:18:15.140 "unmap": true, 00:18:15.140 "flush": true, 00:18:15.140 "reset": true, 00:18:15.140 "nvme_admin": false, 00:18:15.140 "nvme_io": false, 00:18:15.140 "nvme_io_md": false, 00:18:15.140 "write_zeroes": true, 00:18:15.140 "zcopy": true, 00:18:15.140 "get_zone_info": false, 00:18:15.140 "zone_management": false, 00:18:15.140 "zone_append": false, 00:18:15.140 "compare": false, 00:18:15.140 "compare_and_write": false, 00:18:15.140 "abort": true, 00:18:15.140 "seek_hole": false, 00:18:15.140 "seek_data": false, 00:18:15.140 "copy": true, 00:18:15.140 "nvme_iov_md": false 00:18:15.140 }, 00:18:15.140 "memory_domains": [ 00:18:15.140 { 00:18:15.140 "dma_device_id": "system", 00:18:15.140 "dma_device_type": 1 00:18:15.140 }, 00:18:15.140 { 00:18:15.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.140 "dma_device_type": 2 00:18:15.140 } 00:18:15.140 ], 00:18:15.140 "driver_specific": {} 00:18:15.140 }' 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.140 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.399 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:18:15.399 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.399 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.399 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.399 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.399 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:15.659 [2024-07-23 17:12:10.945327] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:15.659 [2024-07-23 17:12:10.945354] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:15.659 [2024-07-23 17:12:10.945397] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.659 17:12:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:15.918 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.918 "name": "Existed_Raid", 00:18:15.918 "uuid": "e7ea646e-bac6-4708-bec7-42062c598c81", 00:18:15.918 "strip_size_kb": 64, 00:18:15.918 "state": "offline", 00:18:15.918 "raid_level": "raid0", 00:18:15.918 "superblock": true, 00:18:15.918 "num_base_bdevs": 3, 00:18:15.918 "num_base_bdevs_discovered": 2, 00:18:15.918 "num_base_bdevs_operational": 2, 00:18:15.918 "base_bdevs_list": [ 00:18:15.918 { 00:18:15.918 "name": null, 00:18:15.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.918 "is_configured": false, 00:18:15.918 "data_offset": 2048, 00:18:15.918 "data_size": 63488 00:18:15.918 }, 00:18:15.918 { 00:18:15.918 "name": "BaseBdev2", 00:18:15.918 "uuid": "4029d350-6123-47e0-95af-5c6b88c017d2", 00:18:15.918 "is_configured": true, 00:18:15.918 "data_offset": 2048, 00:18:15.918 "data_size": 
63488 00:18:15.918 }, 00:18:15.918 { 00:18:15.918 "name": "BaseBdev3", 00:18:15.918 "uuid": "354c4de5-698b-422f-a76e-6ed40a8e993f", 00:18:15.918 "is_configured": true, 00:18:15.918 "data_offset": 2048, 00:18:15.918 "data_size": 63488 00:18:15.918 } 00:18:15.918 ] 00:18:15.918 }' 00:18:15.918 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.918 17:12:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:16.486 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:16.486 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:16.486 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.486 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:16.744 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:16.744 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:16.744 17:12:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:17.003 [2024-07-23 17:12:12.418314] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:17.265 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:17.265 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:17.265 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
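[Editor's note] The `verify_raid_bdev_state` helper traced above (bdev_raid.sh@126) selects the raid bdev by name from `bdev_raid_get_bdevs all` and checks its state and base-bdev counts. Since raid0 has no redundancy (`has_redundancy raid0` returns 1), deleting BaseBdev1 is expected to drive the array offline. A Python sketch of that selection and check, using a trimmed copy of the offline-state JSON shown above:

```python
import json

# Trimmed copy of the `bdev_raid_get_bdevs all` output above,
# captured after `bdev_malloc_delete BaseBdev1`.
raid_bdevs = json.loads("""
[
  {
    "name": "Existed_Raid",
    "state": "offline",
    "raid_level": "raid0",
    "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 2,
    "num_base_bdevs_operational": 2,
    "base_bdevs_list": [
      {"name": null, "is_configured": false},
      {"name": "BaseBdev2", "is_configured": true},
      {"name": "BaseBdev3", "is_configured": true}
    ]
  }
]
""")

# Equivalent of `jq -r '.[] | select(.name == "Existed_Raid")'`.
info = next(b for b in raid_bdevs if b["name"] == "Existed_Raid")

# raid0 cannot survive the loss of a base bdev, so the expected
# state is "offline" with 2 of 3 base bdevs still configured.
assert info["state"] == "offline"
assert info["num_base_bdevs_operational"] == 2
configured = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
assert configured == 2
```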
00:18:17.265 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:17.524 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:17.524 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:17.524 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:17.524 [2024-07-23 17:12:12.935889] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:17.524 [2024-07-23 17:12:12.935941] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1be2800 name Existed_Raid, state offline 00:18:17.783 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:17.783 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:17.783 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.783 17:12:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:18.042 BaseBdev2 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:18.042 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:18.301 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:18.560 [ 00:18:18.560 { 00:18:18.560 "name": "BaseBdev2", 00:18:18.560 "aliases": [ 00:18:18.560 "abdb858d-50de-4a42-898f-3251fad6bacd" 00:18:18.560 ], 00:18:18.560 "product_name": "Malloc disk", 00:18:18.560 "block_size": 512, 00:18:18.560 "num_blocks": 65536, 00:18:18.560 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:18.560 "assigned_rate_limits": { 00:18:18.560 "rw_ios_per_sec": 0, 00:18:18.560 "rw_mbytes_per_sec": 0, 00:18:18.560 "r_mbytes_per_sec": 0, 00:18:18.560 "w_mbytes_per_sec": 0 00:18:18.560 }, 00:18:18.560 "claimed": false, 00:18:18.560 "zoned": false, 00:18:18.560 "supported_io_types": { 00:18:18.560 "read": true, 00:18:18.560 "write": true, 00:18:18.560 "unmap": true, 00:18:18.560 "flush": 
true, 00:18:18.560 "reset": true, 00:18:18.560 "nvme_admin": false, 00:18:18.560 "nvme_io": false, 00:18:18.560 "nvme_io_md": false, 00:18:18.560 "write_zeroes": true, 00:18:18.560 "zcopy": true, 00:18:18.560 "get_zone_info": false, 00:18:18.560 "zone_management": false, 00:18:18.560 "zone_append": false, 00:18:18.560 "compare": false, 00:18:18.560 "compare_and_write": false, 00:18:18.560 "abort": true, 00:18:18.560 "seek_hole": false, 00:18:18.560 "seek_data": false, 00:18:18.560 "copy": true, 00:18:18.560 "nvme_iov_md": false 00:18:18.560 }, 00:18:18.560 "memory_domains": [ 00:18:18.560 { 00:18:18.560 "dma_device_id": "system", 00:18:18.560 "dma_device_type": 1 00:18:18.560 }, 00:18:18.560 { 00:18:18.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.560 "dma_device_type": 2 00:18:18.560 } 00:18:18.560 ], 00:18:18.560 "driver_specific": {} 00:18:18.560 } 00:18:18.560 ] 00:18:18.560 17:12:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:18.560 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:18.560 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:18.560 17:12:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:18.821 BaseBdev3 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:18.821 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:19.081 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:19.341 [ 00:18:19.341 { 00:18:19.341 "name": "BaseBdev3", 00:18:19.341 "aliases": [ 00:18:19.341 "d0bfa408-5d5c-4d5e-ac43-65f952b698d0" 00:18:19.341 ], 00:18:19.341 "product_name": "Malloc disk", 00:18:19.341 "block_size": 512, 00:18:19.341 "num_blocks": 65536, 00:18:19.341 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:19.341 "assigned_rate_limits": { 00:18:19.341 "rw_ios_per_sec": 0, 00:18:19.341 "rw_mbytes_per_sec": 0, 00:18:19.341 "r_mbytes_per_sec": 0, 00:18:19.341 "w_mbytes_per_sec": 0 00:18:19.341 }, 00:18:19.341 "claimed": false, 00:18:19.341 "zoned": false, 00:18:19.341 "supported_io_types": { 00:18:19.341 "read": true, 00:18:19.341 "write": true, 00:18:19.341 "unmap": true, 00:18:19.341 "flush": true, 00:18:19.341 "reset": true, 00:18:19.341 "nvme_admin": false, 00:18:19.341 "nvme_io": false, 00:18:19.341 "nvme_io_md": false, 00:18:19.341 "write_zeroes": true, 00:18:19.341 "zcopy": true, 00:18:19.341 "get_zone_info": false, 00:18:19.341 "zone_management": false, 00:18:19.341 "zone_append": false, 00:18:19.341 "compare": false, 00:18:19.341 "compare_and_write": false, 00:18:19.341 "abort": true, 00:18:19.341 "seek_hole": false, 00:18:19.341 "seek_data": false, 00:18:19.341 "copy": true, 00:18:19.341 "nvme_iov_md": false 00:18:19.341 }, 00:18:19.341 "memory_domains": [ 00:18:19.341 { 00:18:19.341 "dma_device_id": "system", 00:18:19.341 "dma_device_type": 1 
00:18:19.341 }, 00:18:19.341 { 00:18:19.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.341 "dma_device_type": 2 00:18:19.341 } 00:18:19.341 ], 00:18:19.341 "driver_specific": {} 00:18:19.341 } 00:18:19.341 ] 00:18:19.341 17:12:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:19.341 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:19.341 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:19.341 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:19.600 [2024-07-23 17:12:14.894208] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:19.600 [2024-07-23 17:12:14.894250] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:19.600 [2024-07-23 17:12:14.894269] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:19.600 [2024-07-23 17:12:14.895570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.600 17:12:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:19.859 17:12:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:19.859 "name": "Existed_Raid", 00:18:19.859 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:19.859 "strip_size_kb": 64, 00:18:19.859 "state": "configuring", 00:18:19.859 "raid_level": "raid0", 00:18:19.859 "superblock": true, 00:18:19.860 "num_base_bdevs": 3, 00:18:19.860 "num_base_bdevs_discovered": 2, 00:18:19.860 "num_base_bdevs_operational": 3, 00:18:19.860 "base_bdevs_list": [ 00:18:19.860 { 00:18:19.860 "name": "BaseBdev1", 00:18:19.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:19.860 "is_configured": false, 00:18:19.860 "data_offset": 0, 00:18:19.860 "data_size": 0 00:18:19.860 }, 00:18:19.860 { 00:18:19.860 "name": "BaseBdev2", 00:18:19.860 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:19.860 "is_configured": true, 00:18:19.860 "data_offset": 2048, 00:18:19.860 "data_size": 63488 00:18:19.860 }, 00:18:19.860 { 00:18:19.860 "name": "BaseBdev3", 00:18:19.860 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:19.860 "is_configured": true, 00:18:19.860 "data_offset": 2048, 00:18:19.860 
"data_size": 63488 00:18:19.860 } 00:18:19.860 ] 00:18:19.860 }' 00:18:19.860 17:12:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:19.860 17:12:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.428 17:12:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:20.687 [2024-07-23 17:12:15.989078] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:18:20.687 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.950 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.950 "name": "Existed_Raid", 00:18:20.950 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:20.950 "strip_size_kb": 64, 00:18:20.950 "state": "configuring", 00:18:20.950 "raid_level": "raid0", 00:18:20.950 "superblock": true, 00:18:20.950 "num_base_bdevs": 3, 00:18:20.950 "num_base_bdevs_discovered": 1, 00:18:20.950 "num_base_bdevs_operational": 3, 00:18:20.950 "base_bdevs_list": [ 00:18:20.950 { 00:18:20.950 "name": "BaseBdev1", 00:18:20.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.950 "is_configured": false, 00:18:20.950 "data_offset": 0, 00:18:20.950 "data_size": 0 00:18:20.950 }, 00:18:20.950 { 00:18:20.950 "name": null, 00:18:20.950 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:20.950 "is_configured": false, 00:18:20.950 "data_offset": 2048, 00:18:20.950 "data_size": 63488 00:18:20.950 }, 00:18:20.950 { 00:18:20.950 "name": "BaseBdev3", 00:18:20.950 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:20.950 "is_configured": true, 00:18:20.950 "data_offset": 2048, 00:18:20.950 "data_size": 63488 00:18:20.950 } 00:18:20.950 ] 00:18:20.950 }' 00:18:20.950 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.950 17:12:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:21.518 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.518 17:12:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:21.778 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:18:21.778 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:22.037 [2024-07-23 17:12:17.361300] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:22.037 BaseBdev1 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:22.037 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:22.296 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:22.556 [ 00:18:22.556 { 00:18:22.556 "name": "BaseBdev1", 00:18:22.556 "aliases": [ 00:18:22.556 "799966a6-f2d5-4fb3-a41a-ed47692196ee" 00:18:22.556 ], 00:18:22.556 "product_name": "Malloc disk", 00:18:22.556 "block_size": 512, 00:18:22.556 "num_blocks": 65536, 00:18:22.556 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:22.556 "assigned_rate_limits": { 00:18:22.556 "rw_ios_per_sec": 0, 00:18:22.556 "rw_mbytes_per_sec": 0, 00:18:22.556 "r_mbytes_per_sec": 0, 00:18:22.556 
"w_mbytes_per_sec": 0 00:18:22.556 }, 00:18:22.556 "claimed": true, 00:18:22.556 "claim_type": "exclusive_write", 00:18:22.556 "zoned": false, 00:18:22.556 "supported_io_types": { 00:18:22.556 "read": true, 00:18:22.556 "write": true, 00:18:22.556 "unmap": true, 00:18:22.556 "flush": true, 00:18:22.556 "reset": true, 00:18:22.556 "nvme_admin": false, 00:18:22.556 "nvme_io": false, 00:18:22.556 "nvme_io_md": false, 00:18:22.556 "write_zeroes": true, 00:18:22.556 "zcopy": true, 00:18:22.556 "get_zone_info": false, 00:18:22.556 "zone_management": false, 00:18:22.556 "zone_append": false, 00:18:22.556 "compare": false, 00:18:22.556 "compare_and_write": false, 00:18:22.556 "abort": true, 00:18:22.556 "seek_hole": false, 00:18:22.556 "seek_data": false, 00:18:22.556 "copy": true, 00:18:22.556 "nvme_iov_md": false 00:18:22.556 }, 00:18:22.556 "memory_domains": [ 00:18:22.556 { 00:18:22.556 "dma_device_id": "system", 00:18:22.556 "dma_device_type": 1 00:18:22.556 }, 00:18:22.556 { 00:18:22.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.556 "dma_device_type": 2 00:18:22.556 } 00:18:22.556 ], 00:18:22.556 "driver_specific": {} 00:18:22.556 } 00:18:22.556 ] 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.557 17:12:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.816 17:12:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.816 "name": "Existed_Raid", 00:18:22.816 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:22.816 "strip_size_kb": 64, 00:18:22.816 "state": "configuring", 00:18:22.816 "raid_level": "raid0", 00:18:22.816 "superblock": true, 00:18:22.816 "num_base_bdevs": 3, 00:18:22.816 "num_base_bdevs_discovered": 2, 00:18:22.816 "num_base_bdevs_operational": 3, 00:18:22.816 "base_bdevs_list": [ 00:18:22.816 { 00:18:22.816 "name": "BaseBdev1", 00:18:22.816 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:22.816 "is_configured": true, 00:18:22.816 "data_offset": 2048, 00:18:22.816 "data_size": 63488 00:18:22.816 }, 00:18:22.816 { 00:18:22.816 "name": null, 00:18:22.816 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:22.816 "is_configured": false, 00:18:22.816 "data_offset": 2048, 00:18:22.816 "data_size": 63488 00:18:22.816 }, 00:18:22.816 { 00:18:22.816 "name": "BaseBdev3", 00:18:22.816 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:22.816 "is_configured": true, 00:18:22.816 "data_offset": 2048, 00:18:22.816 "data_size": 63488 00:18:22.816 } 
00:18:22.816 ] 00:18:22.816 }' 00:18:22.816 17:12:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.816 17:12:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.384 17:12:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.384 17:12:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:23.644 17:12:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:23.644 17:12:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:23.903 [2024-07-23 17:12:19.182303] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.903 
17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:23.903 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.162 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.162 "name": "Existed_Raid", 00:18:24.162 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:24.162 "strip_size_kb": 64, 00:18:24.162 "state": "configuring", 00:18:24.162 "raid_level": "raid0", 00:18:24.162 "superblock": true, 00:18:24.162 "num_base_bdevs": 3, 00:18:24.162 "num_base_bdevs_discovered": 1, 00:18:24.162 "num_base_bdevs_operational": 3, 00:18:24.162 "base_bdevs_list": [ 00:18:24.162 { 00:18:24.162 "name": "BaseBdev1", 00:18:24.162 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:24.162 "is_configured": true, 00:18:24.162 "data_offset": 2048, 00:18:24.162 "data_size": 63488 00:18:24.162 }, 00:18:24.162 { 00:18:24.162 "name": null, 00:18:24.162 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:24.162 "is_configured": false, 00:18:24.162 "data_offset": 2048, 00:18:24.162 "data_size": 63488 00:18:24.162 }, 00:18:24.162 { 00:18:24.162 "name": null, 00:18:24.162 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:24.162 "is_configured": false, 00:18:24.162 "data_offset": 2048, 00:18:24.162 "data_size": 63488 00:18:24.162 } 00:18:24.162 ] 00:18:24.162 }' 00:18:24.162 17:12:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.162 17:12:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:24.731 17:12:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.731 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:24.991 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:24.991 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:25.250 [2024-07-23 17:12:20.557961] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.250 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.251 17:12:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.251 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.510 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.510 "name": "Existed_Raid", 00:18:25.510 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:25.510 "strip_size_kb": 64, 00:18:25.510 "state": "configuring", 00:18:25.510 "raid_level": "raid0", 00:18:25.510 "superblock": true, 00:18:25.510 "num_base_bdevs": 3, 00:18:25.510 "num_base_bdevs_discovered": 2, 00:18:25.510 "num_base_bdevs_operational": 3, 00:18:25.510 "base_bdevs_list": [ 00:18:25.510 { 00:18:25.510 "name": "BaseBdev1", 00:18:25.510 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:25.510 "is_configured": true, 00:18:25.510 "data_offset": 2048, 00:18:25.510 "data_size": 63488 00:18:25.510 }, 00:18:25.510 { 00:18:25.510 "name": null, 00:18:25.510 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:25.510 "is_configured": false, 00:18:25.510 "data_offset": 2048, 00:18:25.510 "data_size": 63488 00:18:25.510 }, 00:18:25.510 { 00:18:25.510 "name": "BaseBdev3", 00:18:25.510 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:25.510 "is_configured": true, 00:18:25.510 "data_offset": 2048, 00:18:25.510 "data_size": 63488 00:18:25.510 } 00:18:25.510 ] 00:18:25.510 }' 00:18:25.510 17:12:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.510 17:12:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.078 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.078 17:12:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:26.338 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:26.338 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:26.597 [2024-07-23 17:12:21.921759] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.597 17:12:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.856 17:12:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.856 "name": "Existed_Raid", 00:18:26.856 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:26.856 "strip_size_kb": 64, 00:18:26.856 "state": "configuring", 00:18:26.856 "raid_level": "raid0", 00:18:26.856 "superblock": true, 00:18:26.856 "num_base_bdevs": 3, 00:18:26.856 "num_base_bdevs_discovered": 1, 00:18:26.856 "num_base_bdevs_operational": 3, 00:18:26.856 "base_bdevs_list": [ 00:18:26.856 { 00:18:26.856 "name": null, 00:18:26.856 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:26.856 "is_configured": false, 00:18:26.856 "data_offset": 2048, 00:18:26.856 "data_size": 63488 00:18:26.856 }, 00:18:26.856 { 00:18:26.856 "name": null, 00:18:26.856 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:26.856 "is_configured": false, 00:18:26.856 "data_offset": 2048, 00:18:26.856 "data_size": 63488 00:18:26.856 }, 00:18:26.856 { 00:18:26.856 "name": "BaseBdev3", 00:18:26.856 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:26.856 "is_configured": true, 00:18:26.856 "data_offset": 2048, 00:18:26.856 "data_size": 63488 00:18:26.856 } 00:18:26.856 ] 00:18:26.856 }' 00:18:26.856 17:12:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.856 17:12:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.793 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.793 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:28.052 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:28.052 17:12:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:28.311 [2024-07-23 17:12:23.562521] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.311 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.570 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.570 "name": 
"Existed_Raid", 00:18:28.570 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:28.570 "strip_size_kb": 64, 00:18:28.570 "state": "configuring", 00:18:28.570 "raid_level": "raid0", 00:18:28.570 "superblock": true, 00:18:28.570 "num_base_bdevs": 3, 00:18:28.570 "num_base_bdevs_discovered": 2, 00:18:28.570 "num_base_bdevs_operational": 3, 00:18:28.570 "base_bdevs_list": [ 00:18:28.570 { 00:18:28.570 "name": null, 00:18:28.570 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:28.570 "is_configured": false, 00:18:28.570 "data_offset": 2048, 00:18:28.570 "data_size": 63488 00:18:28.570 }, 00:18:28.570 { 00:18:28.570 "name": "BaseBdev2", 00:18:28.570 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:28.570 "is_configured": true, 00:18:28.570 "data_offset": 2048, 00:18:28.570 "data_size": 63488 00:18:28.570 }, 00:18:28.570 { 00:18:28.570 "name": "BaseBdev3", 00:18:28.570 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:28.570 "is_configured": true, 00:18:28.570 "data_offset": 2048, 00:18:28.570 "data_size": 63488 00:18:28.570 } 00:18:28.570 ] 00:18:28.570 }' 00:18:28.570 17:12:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.570 17:12:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.507 17:12:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.507 17:12:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:29.766 17:12:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:29.766 17:12:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.766 17:12:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 799966a6-f2d5-4fb3-a41a-ed47692196ee 00:18:30.026 [2024-07-23 17:12:25.419810] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:30.026 [2024-07-23 17:12:25.419982] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d97110 00:18:30.026 [2024-07-23 17:12:25.419997] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:30.026 [2024-07-23 17:12:25.420171] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bdcd40 00:18:30.026 [2024-07-23 17:12:25.420284] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d97110 00:18:30.026 [2024-07-23 17:12:25.420294] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d97110 00:18:30.026 [2024-07-23 17:12:25.420385] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:30.026 NewBaseBdev 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:30.026 17:12:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:30.026 17:12:25 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:30.594 17:12:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:30.852 [ 00:18:30.852 { 00:18:30.852 "name": "NewBaseBdev", 00:18:30.852 "aliases": [ 00:18:30.852 "799966a6-f2d5-4fb3-a41a-ed47692196ee" 00:18:30.852 ], 00:18:30.852 "product_name": "Malloc disk", 00:18:30.852 "block_size": 512, 00:18:30.852 "num_blocks": 65536, 00:18:30.852 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:30.852 "assigned_rate_limits": { 00:18:30.852 "rw_ios_per_sec": 0, 00:18:30.852 "rw_mbytes_per_sec": 0, 00:18:30.852 "r_mbytes_per_sec": 0, 00:18:30.852 "w_mbytes_per_sec": 0 00:18:30.852 }, 00:18:30.852 "claimed": true, 00:18:30.852 "claim_type": "exclusive_write", 00:18:30.852 "zoned": false, 00:18:30.852 "supported_io_types": { 00:18:30.852 "read": true, 00:18:30.852 "write": true, 00:18:30.852 "unmap": true, 00:18:30.852 "flush": true, 00:18:30.852 "reset": true, 00:18:30.852 "nvme_admin": false, 00:18:30.852 "nvme_io": false, 00:18:30.852 "nvme_io_md": false, 00:18:30.852 "write_zeroes": true, 00:18:30.852 "zcopy": true, 00:18:30.852 "get_zone_info": false, 00:18:30.852 "zone_management": false, 00:18:30.852 "zone_append": false, 00:18:30.852 "compare": false, 00:18:30.852 "compare_and_write": false, 00:18:30.852 "abort": true, 00:18:30.852 "seek_hole": false, 00:18:30.852 "seek_data": false, 00:18:30.852 "copy": true, 00:18:30.852 "nvme_iov_md": false 00:18:30.852 }, 00:18:30.852 "memory_domains": [ 00:18:30.852 { 00:18:30.852 "dma_device_id": "system", 00:18:30.852 "dma_device_type": 1 00:18:30.852 }, 00:18:30.852 { 00:18:30.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.852 "dma_device_type": 2 00:18:30.852 } 
00:18:30.852 ], 00:18:30.852 "driver_specific": {} 00:18:30.852 } 00:18:30.852 ] 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.852 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.853 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.853 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.853 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.853 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.111 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.111 "name": "Existed_Raid", 00:18:31.111 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:31.111 "strip_size_kb": 64, 00:18:31.111 "state": "online", 00:18:31.111 
"raid_level": "raid0", 00:18:31.111 "superblock": true, 00:18:31.111 "num_base_bdevs": 3, 00:18:31.111 "num_base_bdevs_discovered": 3, 00:18:31.111 "num_base_bdevs_operational": 3, 00:18:31.111 "base_bdevs_list": [ 00:18:31.111 { 00:18:31.111 "name": "NewBaseBdev", 00:18:31.111 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:31.111 "is_configured": true, 00:18:31.111 "data_offset": 2048, 00:18:31.111 "data_size": 63488 00:18:31.111 }, 00:18:31.111 { 00:18:31.111 "name": "BaseBdev2", 00:18:31.111 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:31.111 "is_configured": true, 00:18:31.111 "data_offset": 2048, 00:18:31.111 "data_size": 63488 00:18:31.111 }, 00:18:31.111 { 00:18:31.111 "name": "BaseBdev3", 00:18:31.111 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:31.111 "is_configured": true, 00:18:31.111 "data_offset": 2048, 00:18:31.111 "data_size": 63488 00:18:31.111 } 00:18:31.111 ] 00:18:31.111 }' 00:18:31.111 17:12:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.111 17:12:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.677 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:31.678 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:31.936 [2024-07-23 17:12:27.192791] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:31.936 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:31.936 "name": "Existed_Raid", 00:18:31.936 "aliases": [ 00:18:31.936 "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b" 00:18:31.936 ], 00:18:31.936 "product_name": "Raid Volume", 00:18:31.936 "block_size": 512, 00:18:31.936 "num_blocks": 190464, 00:18:31.936 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:31.936 "assigned_rate_limits": { 00:18:31.936 "rw_ios_per_sec": 0, 00:18:31.936 "rw_mbytes_per_sec": 0, 00:18:31.936 "r_mbytes_per_sec": 0, 00:18:31.936 "w_mbytes_per_sec": 0 00:18:31.936 }, 00:18:31.936 "claimed": false, 00:18:31.936 "zoned": false, 00:18:31.936 "supported_io_types": { 00:18:31.936 "read": true, 00:18:31.936 "write": true, 00:18:31.936 "unmap": true, 00:18:31.936 "flush": true, 00:18:31.936 "reset": true, 00:18:31.936 "nvme_admin": false, 00:18:31.936 "nvme_io": false, 00:18:31.936 "nvme_io_md": false, 00:18:31.936 "write_zeroes": true, 00:18:31.936 "zcopy": false, 00:18:31.936 "get_zone_info": false, 00:18:31.936 "zone_management": false, 00:18:31.936 "zone_append": false, 00:18:31.936 "compare": false, 00:18:31.936 "compare_and_write": false, 00:18:31.936 "abort": false, 00:18:31.936 "seek_hole": false, 00:18:31.936 "seek_data": false, 00:18:31.936 "copy": false, 00:18:31.936 "nvme_iov_md": false 00:18:31.936 }, 00:18:31.936 "memory_domains": [ 00:18:31.936 { 00:18:31.936 "dma_device_id": "system", 00:18:31.936 "dma_device_type": 1 00:18:31.936 }, 00:18:31.936 { 00:18:31.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.936 "dma_device_type": 2 00:18:31.936 }, 00:18:31.936 { 00:18:31.936 "dma_device_id": "system", 00:18:31.936 "dma_device_type": 1 00:18:31.936 
}, 00:18:31.936 { 00:18:31.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.936 "dma_device_type": 2 00:18:31.936 }, 00:18:31.936 { 00:18:31.936 "dma_device_id": "system", 00:18:31.936 "dma_device_type": 1 00:18:31.936 }, 00:18:31.936 { 00:18:31.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.936 "dma_device_type": 2 00:18:31.936 } 00:18:31.936 ], 00:18:31.936 "driver_specific": { 00:18:31.936 "raid": { 00:18:31.936 "uuid": "61d0e40e-d91f-4cf9-ab66-b774fbbb4a9b", 00:18:31.936 "strip_size_kb": 64, 00:18:31.936 "state": "online", 00:18:31.936 "raid_level": "raid0", 00:18:31.936 "superblock": true, 00:18:31.936 "num_base_bdevs": 3, 00:18:31.936 "num_base_bdevs_discovered": 3, 00:18:31.936 "num_base_bdevs_operational": 3, 00:18:31.936 "base_bdevs_list": [ 00:18:31.936 { 00:18:31.936 "name": "NewBaseBdev", 00:18:31.936 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:31.936 "is_configured": true, 00:18:31.936 "data_offset": 2048, 00:18:31.936 "data_size": 63488 00:18:31.936 }, 00:18:31.936 { 00:18:31.936 "name": "BaseBdev2", 00:18:31.936 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:31.936 "is_configured": true, 00:18:31.936 "data_offset": 2048, 00:18:31.936 "data_size": 63488 00:18:31.936 }, 00:18:31.936 { 00:18:31.936 "name": "BaseBdev3", 00:18:31.936 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:31.936 "is_configured": true, 00:18:31.936 "data_offset": 2048, 00:18:31.936 "data_size": 63488 00:18:31.936 } 00:18:31.936 ] 00:18:31.936 } 00:18:31.936 } 00:18:31.936 }' 00:18:31.936 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:31.936 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:31.936 BaseBdev2 00:18:31.936 BaseBdev3' 00:18:31.936 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:31.936 
17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:31.936 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.195 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.195 "name": "NewBaseBdev", 00:18:32.195 "aliases": [ 00:18:32.195 "799966a6-f2d5-4fb3-a41a-ed47692196ee" 00:18:32.195 ], 00:18:32.195 "product_name": "Malloc disk", 00:18:32.195 "block_size": 512, 00:18:32.195 "num_blocks": 65536, 00:18:32.195 "uuid": "799966a6-f2d5-4fb3-a41a-ed47692196ee", 00:18:32.195 "assigned_rate_limits": { 00:18:32.195 "rw_ios_per_sec": 0, 00:18:32.195 "rw_mbytes_per_sec": 0, 00:18:32.195 "r_mbytes_per_sec": 0, 00:18:32.195 "w_mbytes_per_sec": 0 00:18:32.195 }, 00:18:32.195 "claimed": true, 00:18:32.195 "claim_type": "exclusive_write", 00:18:32.195 "zoned": false, 00:18:32.195 "supported_io_types": { 00:18:32.195 "read": true, 00:18:32.195 "write": true, 00:18:32.195 "unmap": true, 00:18:32.195 "flush": true, 00:18:32.195 "reset": true, 00:18:32.195 "nvme_admin": false, 00:18:32.195 "nvme_io": false, 00:18:32.195 "nvme_io_md": false, 00:18:32.195 "write_zeroes": true, 00:18:32.195 "zcopy": true, 00:18:32.195 "get_zone_info": false, 00:18:32.195 "zone_management": false, 00:18:32.195 "zone_append": false, 00:18:32.195 "compare": false, 00:18:32.195 "compare_and_write": false, 00:18:32.195 "abort": true, 00:18:32.195 "seek_hole": false, 00:18:32.195 "seek_data": false, 00:18:32.195 "copy": true, 00:18:32.195 "nvme_iov_md": false 00:18:32.195 }, 00:18:32.195 "memory_domains": [ 00:18:32.195 { 00:18:32.195 "dma_device_id": "system", 00:18:32.195 "dma_device_type": 1 00:18:32.195 }, 00:18:32.195 { 00:18:32.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.195 "dma_device_type": 2 00:18:32.195 } 00:18:32.195 ], 00:18:32.195 
"driver_specific": {} 00:18:32.195 }' 00:18:32.195 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.195 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.195 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.195 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.453 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.712 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.712 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.712 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:32.712 17:12:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.712 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.712 "name": "BaseBdev2", 00:18:32.712 "aliases": [ 00:18:32.712 "abdb858d-50de-4a42-898f-3251fad6bacd" 00:18:32.712 ], 00:18:32.712 "product_name": 
"Malloc disk", 00:18:32.712 "block_size": 512, 00:18:32.712 "num_blocks": 65536, 00:18:32.712 "uuid": "abdb858d-50de-4a42-898f-3251fad6bacd", 00:18:32.712 "assigned_rate_limits": { 00:18:32.712 "rw_ios_per_sec": 0, 00:18:32.712 "rw_mbytes_per_sec": 0, 00:18:32.712 "r_mbytes_per_sec": 0, 00:18:32.712 "w_mbytes_per_sec": 0 00:18:32.712 }, 00:18:32.712 "claimed": true, 00:18:32.712 "claim_type": "exclusive_write", 00:18:32.712 "zoned": false, 00:18:32.712 "supported_io_types": { 00:18:32.712 "read": true, 00:18:32.712 "write": true, 00:18:32.712 "unmap": true, 00:18:32.712 "flush": true, 00:18:32.712 "reset": true, 00:18:32.712 "nvme_admin": false, 00:18:32.712 "nvme_io": false, 00:18:32.712 "nvme_io_md": false, 00:18:32.712 "write_zeroes": true, 00:18:32.712 "zcopy": true, 00:18:32.712 "get_zone_info": false, 00:18:32.712 "zone_management": false, 00:18:32.712 "zone_append": false, 00:18:32.712 "compare": false, 00:18:32.712 "compare_and_write": false, 00:18:32.712 "abort": true, 00:18:32.712 "seek_hole": false, 00:18:32.712 "seek_data": false, 00:18:32.712 "copy": true, 00:18:32.712 "nvme_iov_md": false 00:18:32.712 }, 00:18:32.712 "memory_domains": [ 00:18:32.712 { 00:18:32.712 "dma_device_id": "system", 00:18:32.712 "dma_device_type": 1 00:18:32.712 }, 00:18:32.712 { 00:18:32.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.712 "dma_device_type": 2 00:18:32.712 } 00:18:32.712 ], 00:18:32.712 "driver_specific": {} 00:18:32.712 }' 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.970 
17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.970 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:33.229 "name": "BaseBdev3", 00:18:33.229 "aliases": [ 00:18:33.229 "d0bfa408-5d5c-4d5e-ac43-65f952b698d0" 00:18:33.229 ], 00:18:33.229 "product_name": "Malloc disk", 00:18:33.229 "block_size": 512, 00:18:33.229 "num_blocks": 65536, 00:18:33.229 "uuid": "d0bfa408-5d5c-4d5e-ac43-65f952b698d0", 00:18:33.229 "assigned_rate_limits": { 00:18:33.229 "rw_ios_per_sec": 0, 00:18:33.229 "rw_mbytes_per_sec": 0, 00:18:33.229 "r_mbytes_per_sec": 0, 00:18:33.229 "w_mbytes_per_sec": 0 00:18:33.229 }, 00:18:33.229 "claimed": true, 00:18:33.229 "claim_type": "exclusive_write", 00:18:33.229 "zoned": false, 00:18:33.229 "supported_io_types": { 00:18:33.229 "read": true, 00:18:33.229 "write": true, 00:18:33.229 "unmap": true, 
00:18:33.229 "flush": true, 00:18:33.229 "reset": true, 00:18:33.229 "nvme_admin": false, 00:18:33.229 "nvme_io": false, 00:18:33.229 "nvme_io_md": false, 00:18:33.229 "write_zeroes": true, 00:18:33.229 "zcopy": true, 00:18:33.229 "get_zone_info": false, 00:18:33.229 "zone_management": false, 00:18:33.229 "zone_append": false, 00:18:33.229 "compare": false, 00:18:33.229 "compare_and_write": false, 00:18:33.229 "abort": true, 00:18:33.229 "seek_hole": false, 00:18:33.229 "seek_data": false, 00:18:33.229 "copy": true, 00:18:33.229 "nvme_iov_md": false 00:18:33.229 }, 00:18:33.229 "memory_domains": [ 00:18:33.229 { 00:18:33.229 "dma_device_id": "system", 00:18:33.229 "dma_device_type": 1 00:18:33.229 }, 00:18:33.229 { 00:18:33.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.229 "dma_device_type": 2 00:18:33.229 } 00:18:33.229 ], 00:18:33.229 "driver_specific": {} 00:18:33.229 }' 00:18:33.229 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.487 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.746 17:12:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.746 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.746 17:12:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:34.004 [2024-07-23 17:12:29.198030] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:34.004 [2024-07-23 17:12:29.198062] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:34.004 [2024-07-23 17:12:29.198121] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:34.004 [2024-07-23 17:12:29.198172] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:34.004 [2024-07-23 17:12:29.198184] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d97110 name Existed_Raid, state offline 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4133389 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4133389 ']' 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4133389 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4133389 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4133389' 00:18:34.004 killing process with pid 4133389 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4133389 00:18:34.004 [2024-07-23 17:12:29.268517] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:34.004 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4133389 00:18:34.004 [2024-07-23 17:12:29.299769] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:34.263 17:12:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:34.263 00:18:34.263 real 0m30.348s 00:18:34.263 user 0m55.755s 00:18:34.263 sys 0m5.355s 00:18:34.263 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:34.263 17:12:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.263 ************************************ 00:18:34.263 END TEST raid_state_function_test_sb 00:18:34.263 ************************************ 00:18:34.263 17:12:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:34.263 17:12:29 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:18:34.263 17:12:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:34.264 17:12:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:34.264 17:12:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:34.264 ************************************ 00:18:34.264 START TEST raid_superblock_test 00:18:34.264 ************************************ 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4138519 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4138519 /var/tmp/spdk-raid.sock 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4138519 ']' 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:34.264 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:34.264 17:12:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.264 [2024-07-23 17:12:29.679189] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:18:34.264 [2024-07-23 17:12:29.679255] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4138519 ] 00:18:34.523 [2024-07-23 17:12:29.812680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:34.523 [2024-07-23 17:12:29.864912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:34.523 [2024-07-23 17:12:29.921322] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:34.523 [2024-07-23 17:12:29.921347] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:34.831 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:35.111 malloc1 00:18:35.111 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:35.371 [2024-07-23 17:12:30.630189] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:35.371 [2024-07-23 17:12:30.630238] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.371 [2024-07-23 17:12:30.630262] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110b070 00:18:35.371 [2024-07-23 17:12:30.630274] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.371 [2024-07-23 17:12:30.631950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.371 [2024-07-23 17:12:30.631979] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:35.371 pt1 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:35.371 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:35.371 17:12:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:35.630 malloc2 00:18:35.630 17:12:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:35.890 [2024-07-23 17:12:31.129452] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:35.890 [2024-07-23 17:12:31.129498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:35.890 [2024-07-23 17:12:31.129516] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xff1920 00:18:35.890 [2024-07-23 17:12:31.129528] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:35.890 [2024-07-23 17:12:31.131271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:35.890 [2024-07-23 17:12:31.131303] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:35.890 pt2 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:35.890 17:12:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:35.890 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:36.459 malloc3 00:18:36.459 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:36.718 [2024-07-23 17:12:31.897458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:36.718 [2024-07-23 17:12:31.897503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.718 [2024-07-23 17:12:31.897520] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11033e0 00:18:36.718 [2024-07-23 17:12:31.897532] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.718 [2024-07-23 17:12:31.899091] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.718 [2024-07-23 17:12:31.899120] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:36.718 pt3 00:18:36.718 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:36.718 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:36.718 17:12:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:18:36.978 [2024-07-23 17:12:32.394779] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:36.978 [2024-07-23 17:12:32.396135] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:18:36.978 [2024-07-23 17:12:32.396189] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:36.978 [2024-07-23 17:12:32.396343] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1105870 00:18:36.978 [2024-07-23 17:12:32.396354] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:36.978 [2024-07-23 17:12:32.396554] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf705e0 00:18:36.978 [2024-07-23 17:12:32.396696] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1105870 00:18:36.978 [2024-07-23 17:12:32.396706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1105870 00:18:36.978 [2024-07-23 17:12:32.396804] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.237 
17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.237 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.496 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.496 "name": "raid_bdev1", 00:18:37.496 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:37.496 "strip_size_kb": 64, 00:18:37.496 "state": "online", 00:18:37.496 "raid_level": "raid0", 00:18:37.496 "superblock": true, 00:18:37.496 "num_base_bdevs": 3, 00:18:37.496 "num_base_bdevs_discovered": 3, 00:18:37.496 "num_base_bdevs_operational": 3, 00:18:37.496 "base_bdevs_list": [ 00:18:37.496 { 00:18:37.496 "name": "pt1", 00:18:37.496 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:37.496 "is_configured": true, 00:18:37.496 "data_offset": 2048, 00:18:37.496 "data_size": 63488 00:18:37.496 }, 00:18:37.496 { 00:18:37.496 "name": "pt2", 00:18:37.496 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:37.496 "is_configured": true, 00:18:37.496 "data_offset": 2048, 00:18:37.496 "data_size": 63488 00:18:37.496 }, 00:18:37.496 { 00:18:37.496 "name": "pt3", 00:18:37.496 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:37.496 "is_configured": true, 00:18:37.496 "data_offset": 2048, 00:18:37.496 "data_size": 63488 00:18:37.496 } 00:18:37.496 ] 00:18:37.496 }' 00:18:37.496 17:12:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.496 17:12:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:38.064 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:38.323 [2024-07-23 17:12:33.517992] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:38.323 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:38.323 "name": "raid_bdev1", 00:18:38.323 "aliases": [ 00:18:38.323 "cb6233d5-a275-4777-beb7-2ce04ade60d8" 00:18:38.323 ], 00:18:38.323 "product_name": "Raid Volume", 00:18:38.323 "block_size": 512, 00:18:38.323 "num_blocks": 190464, 00:18:38.323 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:38.323 "assigned_rate_limits": { 00:18:38.323 "rw_ios_per_sec": 0, 00:18:38.323 "rw_mbytes_per_sec": 0, 00:18:38.323 "r_mbytes_per_sec": 0, 00:18:38.323 "w_mbytes_per_sec": 0 00:18:38.323 }, 00:18:38.323 "claimed": false, 00:18:38.323 "zoned": false, 00:18:38.323 "supported_io_types": { 00:18:38.323 "read": true, 00:18:38.323 "write": true, 00:18:38.323 "unmap": true, 00:18:38.323 "flush": true, 00:18:38.323 "reset": true, 00:18:38.323 "nvme_admin": false, 00:18:38.323 "nvme_io": false, 00:18:38.323 "nvme_io_md": false, 00:18:38.323 "write_zeroes": true, 00:18:38.323 "zcopy": false, 00:18:38.323 "get_zone_info": false, 00:18:38.323 "zone_management": false, 00:18:38.323 "zone_append": false, 00:18:38.323 "compare": false, 00:18:38.323 "compare_and_write": false, 00:18:38.323 "abort": false, 00:18:38.323 
"seek_hole": false, 00:18:38.323 "seek_data": false, 00:18:38.323 "copy": false, 00:18:38.323 "nvme_iov_md": false 00:18:38.323 }, 00:18:38.323 "memory_domains": [ 00:18:38.323 { 00:18:38.323 "dma_device_id": "system", 00:18:38.323 "dma_device_type": 1 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.323 "dma_device_type": 2 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "dma_device_id": "system", 00:18:38.323 "dma_device_type": 1 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.323 "dma_device_type": 2 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "dma_device_id": "system", 00:18:38.323 "dma_device_type": 1 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.323 "dma_device_type": 2 00:18:38.323 } 00:18:38.323 ], 00:18:38.323 "driver_specific": { 00:18:38.323 "raid": { 00:18:38.323 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:38.323 "strip_size_kb": 64, 00:18:38.323 "state": "online", 00:18:38.323 "raid_level": "raid0", 00:18:38.323 "superblock": true, 00:18:38.323 "num_base_bdevs": 3, 00:18:38.323 "num_base_bdevs_discovered": 3, 00:18:38.323 "num_base_bdevs_operational": 3, 00:18:38.323 "base_bdevs_list": [ 00:18:38.323 { 00:18:38.323 "name": "pt1", 00:18:38.323 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:38.323 "is_configured": true, 00:18:38.323 "data_offset": 2048, 00:18:38.323 "data_size": 63488 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "name": "pt2", 00:18:38.323 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:38.323 "is_configured": true, 00:18:38.323 "data_offset": 2048, 00:18:38.323 "data_size": 63488 00:18:38.323 }, 00:18:38.323 { 00:18:38.323 "name": "pt3", 00:18:38.323 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:38.323 "is_configured": true, 00:18:38.323 "data_offset": 2048, 00:18:38.323 "data_size": 63488 00:18:38.323 } 00:18:38.323 ] 00:18:38.323 } 00:18:38.323 } 00:18:38.323 }' 
00:18:38.323 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:38.323 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:38.323 pt2 00:18:38.323 pt3' 00:18:38.323 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.323 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:38.323 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.582 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.582 "name": "pt1", 00:18:38.582 "aliases": [ 00:18:38.582 "00000000-0000-0000-0000-000000000001" 00:18:38.582 ], 00:18:38.582 "product_name": "passthru", 00:18:38.582 "block_size": 512, 00:18:38.582 "num_blocks": 65536, 00:18:38.582 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:38.582 "assigned_rate_limits": { 00:18:38.582 "rw_ios_per_sec": 0, 00:18:38.582 "rw_mbytes_per_sec": 0, 00:18:38.582 "r_mbytes_per_sec": 0, 00:18:38.582 "w_mbytes_per_sec": 0 00:18:38.582 }, 00:18:38.582 "claimed": true, 00:18:38.582 "claim_type": "exclusive_write", 00:18:38.582 "zoned": false, 00:18:38.582 "supported_io_types": { 00:18:38.582 "read": true, 00:18:38.582 "write": true, 00:18:38.582 "unmap": true, 00:18:38.582 "flush": true, 00:18:38.582 "reset": true, 00:18:38.582 "nvme_admin": false, 00:18:38.582 "nvme_io": false, 00:18:38.582 "nvme_io_md": false, 00:18:38.582 "write_zeroes": true, 00:18:38.582 "zcopy": true, 00:18:38.582 "get_zone_info": false, 00:18:38.582 "zone_management": false, 00:18:38.582 "zone_append": false, 00:18:38.582 "compare": false, 00:18:38.582 "compare_and_write": false, 00:18:38.582 "abort": true, 00:18:38.582 "seek_hole": false, 00:18:38.582 
"seek_data": false, 00:18:38.582 "copy": true, 00:18:38.582 "nvme_iov_md": false 00:18:38.582 }, 00:18:38.582 "memory_domains": [ 00:18:38.582 { 00:18:38.582 "dma_device_id": "system", 00:18:38.582 "dma_device_type": 1 00:18:38.582 }, 00:18:38.582 { 00:18:38.582 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.582 "dma_device_type": 2 00:18:38.582 } 00:18:38.582 ], 00:18:38.582 "driver_specific": { 00:18:38.582 "passthru": { 00:18:38.582 "name": "pt1", 00:18:38.582 "base_bdev_name": "malloc1" 00:18:38.582 } 00:18:38.582 } 00:18:38.582 }' 00:18:38.582 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.582 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.582 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.582 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.582 17:12:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:38.841 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:39.100 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:39.100 "name": "pt2", 00:18:39.100 "aliases": [ 00:18:39.100 "00000000-0000-0000-0000-000000000002" 00:18:39.100 ], 00:18:39.100 "product_name": "passthru", 00:18:39.100 "block_size": 512, 00:18:39.100 "num_blocks": 65536, 00:18:39.100 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:39.100 "assigned_rate_limits": { 00:18:39.100 "rw_ios_per_sec": 0, 00:18:39.100 "rw_mbytes_per_sec": 0, 00:18:39.100 "r_mbytes_per_sec": 0, 00:18:39.100 "w_mbytes_per_sec": 0 00:18:39.100 }, 00:18:39.100 "claimed": true, 00:18:39.100 "claim_type": "exclusive_write", 00:18:39.100 "zoned": false, 00:18:39.100 "supported_io_types": { 00:18:39.100 "read": true, 00:18:39.100 "write": true, 00:18:39.100 "unmap": true, 00:18:39.100 "flush": true, 00:18:39.100 "reset": true, 00:18:39.100 "nvme_admin": false, 00:18:39.100 "nvme_io": false, 00:18:39.100 "nvme_io_md": false, 00:18:39.100 "write_zeroes": true, 00:18:39.100 "zcopy": true, 00:18:39.100 "get_zone_info": false, 00:18:39.100 "zone_management": false, 00:18:39.100 "zone_append": false, 00:18:39.100 "compare": false, 00:18:39.100 "compare_and_write": false, 00:18:39.100 "abort": true, 00:18:39.100 "seek_hole": false, 00:18:39.100 "seek_data": false, 00:18:39.100 "copy": true, 00:18:39.100 "nvme_iov_md": false 00:18:39.100 }, 00:18:39.100 "memory_domains": [ 00:18:39.100 { 00:18:39.100 "dma_device_id": "system", 00:18:39.100 "dma_device_type": 1 00:18:39.100 }, 00:18:39.100 { 00:18:39.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.100 "dma_device_type": 2 00:18:39.100 } 00:18:39.100 ], 00:18:39.100 "driver_specific": { 00:18:39.100 "passthru": { 00:18:39.100 "name": "pt2", 00:18:39.100 "base_bdev_name": "malloc2" 00:18:39.100 } 00:18:39.100 } 00:18:39.100 }' 00:18:39.100 17:12:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.100 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.359 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.618 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.618 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:39.618 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:39.618 17:12:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:39.877 "name": "pt3", 00:18:39.877 "aliases": [ 00:18:39.877 "00000000-0000-0000-0000-000000000003" 00:18:39.877 ], 00:18:39.877 "product_name": "passthru", 00:18:39.877 "block_size": 512, 00:18:39.877 "num_blocks": 65536, 00:18:39.877 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:39.877 "assigned_rate_limits": { 
00:18:39.877 "rw_ios_per_sec": 0, 00:18:39.877 "rw_mbytes_per_sec": 0, 00:18:39.877 "r_mbytes_per_sec": 0, 00:18:39.877 "w_mbytes_per_sec": 0 00:18:39.877 }, 00:18:39.877 "claimed": true, 00:18:39.877 "claim_type": "exclusive_write", 00:18:39.877 "zoned": false, 00:18:39.877 "supported_io_types": { 00:18:39.877 "read": true, 00:18:39.877 "write": true, 00:18:39.877 "unmap": true, 00:18:39.877 "flush": true, 00:18:39.877 "reset": true, 00:18:39.877 "nvme_admin": false, 00:18:39.877 "nvme_io": false, 00:18:39.877 "nvme_io_md": false, 00:18:39.877 "write_zeroes": true, 00:18:39.877 "zcopy": true, 00:18:39.877 "get_zone_info": false, 00:18:39.877 "zone_management": false, 00:18:39.877 "zone_append": false, 00:18:39.877 "compare": false, 00:18:39.877 "compare_and_write": false, 00:18:39.877 "abort": true, 00:18:39.877 "seek_hole": false, 00:18:39.877 "seek_data": false, 00:18:39.877 "copy": true, 00:18:39.877 "nvme_iov_md": false 00:18:39.877 }, 00:18:39.877 "memory_domains": [ 00:18:39.877 { 00:18:39.877 "dma_device_id": "system", 00:18:39.877 "dma_device_type": 1 00:18:39.877 }, 00:18:39.877 { 00:18:39.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.877 "dma_device_type": 2 00:18:39.877 } 00:18:39.877 ], 00:18:39.877 "driver_specific": { 00:18:39.877 "passthru": { 00:18:39.877 "name": "pt3", 00:18:39.877 "base_bdev_name": "malloc3" 00:18:39.877 } 00:18:39.877 } 00:18:39.877 }' 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.877 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.136 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:40.136 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:40.136 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:40.136 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:40.395 [2024-07-23 17:12:35.611690] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:40.395 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=cb6233d5-a275-4777-beb7-2ce04ade60d8 00:18:40.395 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z cb6233d5-a275-4777-beb7-2ce04ade60d8 ']' 00:18:40.395 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:40.654 [2024-07-23 17:12:35.856073] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:40.654 [2024-07-23 17:12:35.856096] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:40.654 [2024-07-23 17:12:35.856146] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:40.654 [2024-07-23 17:12:35.856200] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:18:40.654 [2024-07-23 17:12:35.856211] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1105870 name raid_bdev1, state offline 00:18:40.654 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.654 17:12:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:40.913 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:40.913 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:40.913 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:40.913 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:41.172 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:41.172 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:41.172 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:41.172 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:41.431 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:41.431 17:12:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:41.690 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:41.949 [2024-07-23 17:12:37.303846] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:41.949 [2024-07-23 17:12:37.305220] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:41.949 [2024-07-23 17:12:37.305261] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:41.949 [2024-07-23 17:12:37.305307] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:41.949 [2024-07-23 17:12:37.305345] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:41.949 [2024-07-23 17:12:37.305374] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:41.949 [2024-07-23 17:12:37.305392] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:41.949 [2024-07-23 17:12:37.305401] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xff1090 name raid_bdev1, state configuring 00:18:41.949 request: 00:18:41.949 { 00:18:41.949 "name": "raid_bdev1", 00:18:41.949 "raid_level": "raid0", 00:18:41.949 "base_bdevs": [ 00:18:41.949 "malloc1", 00:18:41.949 "malloc2", 00:18:41.949 "malloc3" 00:18:41.949 ], 00:18:41.949 "strip_size_kb": 64, 00:18:41.949 "superblock": false, 00:18:41.949 "method": "bdev_raid_create", 00:18:41.949 "req_id": 1 00:18:41.949 } 00:18:41.949 Got JSON-RPC error response 00:18:41.949 response: 00:18:41.949 { 00:18:41.949 "code": -17, 00:18:41.949 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:41.949 } 00:18:41.949 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:41.949 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:18:41.949 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:41.949 17:12:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:41.949 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.949 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:42.209 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:42.209 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:42.209 17:12:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:42.777 [2024-07-23 17:12:38.049750] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:42.777 [2024-07-23 17:12:38.049795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.777 [2024-07-23 17:12:38.049812] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110b3b0 00:18:42.777 [2024-07-23 17:12:38.049825] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.777 [2024-07-23 17:12:38.051432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.777 [2024-07-23 17:12:38.051462] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:42.777 [2024-07-23 17:12:38.051527] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:42.777 [2024-07-23 17:12:38.051553] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:42.777 pt1 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.777 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.035 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.036 "name": "raid_bdev1", 00:18:43.036 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:43.036 "strip_size_kb": 64, 00:18:43.036 "state": "configuring", 00:18:43.036 "raid_level": "raid0", 00:18:43.036 "superblock": true, 00:18:43.036 "num_base_bdevs": 3, 00:18:43.036 "num_base_bdevs_discovered": 1, 00:18:43.036 "num_base_bdevs_operational": 3, 00:18:43.036 "base_bdevs_list": [ 00:18:43.036 { 00:18:43.036 "name": "pt1", 00:18:43.036 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:43.036 
"is_configured": true, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 }, 00:18:43.036 { 00:18:43.036 "name": null, 00:18:43.036 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:43.036 "is_configured": false, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 }, 00:18:43.036 { 00:18:43.036 "name": null, 00:18:43.036 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:43.036 "is_configured": false, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 } 00:18:43.036 ] 00:18:43.036 }' 00:18:43.036 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.036 17:12:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:43.603 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:18:43.603 17:12:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:44.170 [2024-07-23 17:12:39.417395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:44.170 [2024-07-23 17:12:39.417446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:44.170 [2024-07-23 17:12:39.417466] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1105110 00:18:44.170 [2024-07-23 17:12:39.417479] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:44.170 [2024-07-23 17:12:39.417818] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:44.170 [2024-07-23 17:12:39.417837] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:44.170 [2024-07-23 17:12:39.417912] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:44.170 [2024-07-23 
17:12:39.417933] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:44.170 pt2 00:18:44.170 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:44.429 [2024-07-23 17:12:39.674078] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.429 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.688 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.688 "name": "raid_bdev1", 00:18:44.688 
"uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:44.688 "strip_size_kb": 64, 00:18:44.688 "state": "configuring", 00:18:44.688 "raid_level": "raid0", 00:18:44.688 "superblock": true, 00:18:44.688 "num_base_bdevs": 3, 00:18:44.688 "num_base_bdevs_discovered": 1, 00:18:44.688 "num_base_bdevs_operational": 3, 00:18:44.688 "base_bdevs_list": [ 00:18:44.688 { 00:18:44.688 "name": "pt1", 00:18:44.688 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:44.688 "is_configured": true, 00:18:44.688 "data_offset": 2048, 00:18:44.688 "data_size": 63488 00:18:44.688 }, 00:18:44.688 { 00:18:44.688 "name": null, 00:18:44.688 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:44.688 "is_configured": false, 00:18:44.688 "data_offset": 2048, 00:18:44.688 "data_size": 63488 00:18:44.688 }, 00:18:44.688 { 00:18:44.688 "name": null, 00:18:44.688 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:44.688 "is_configured": false, 00:18:44.688 "data_offset": 2048, 00:18:44.688 "data_size": 63488 00:18:44.688 } 00:18:44.688 ] 00:18:44.688 }' 00:18:44.688 17:12:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.688 17:12:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.255 17:12:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:45.255 17:12:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:45.255 17:12:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:45.513 [2024-07-23 17:12:40.785031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:45.513 [2024-07-23 17:12:40.785083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:45.513 [2024-07-23 17:12:40.785102] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1104860 00:18:45.513 [2024-07-23 17:12:40.785114] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:45.513 [2024-07-23 17:12:40.785450] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:45.513 [2024-07-23 17:12:40.785470] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:45.513 [2024-07-23 17:12:40.785533] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:45.513 [2024-07-23 17:12:40.785552] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:45.513 pt2 00:18:45.513 17:12:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:45.514 17:12:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:45.514 17:12:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:45.772 [2024-07-23 17:12:41.033697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:45.772 [2024-07-23 17:12:41.033730] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:45.772 [2024-07-23 17:12:41.033746] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1106300 00:18:45.772 [2024-07-23 17:12:41.033758] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:45.772 [2024-07-23 17:12:41.034044] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:45.772 [2024-07-23 17:12:41.034062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:45.772 [2024-07-23 17:12:41.034111] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:45.772 
[2024-07-23 17:12:41.034128] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:45.772 [2024-07-23 17:12:41.034226] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1104b80 00:18:45.772 [2024-07-23 17:12:41.034237] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:45.772 [2024-07-23 17:12:41.034402] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf705e0 00:18:45.772 [2024-07-23 17:12:41.034522] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1104b80 00:18:45.772 [2024-07-23 17:12:41.034531] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1104b80 00:18:45.772 [2024-07-23 17:12:41.034621] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:45.772 pt3 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.772 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.773 17:12:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.773 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.773 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.773 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.032 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.032 "name": "raid_bdev1", 00:18:46.032 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:46.032 "strip_size_kb": 64, 00:18:46.032 "state": "online", 00:18:46.032 "raid_level": "raid0", 00:18:46.032 "superblock": true, 00:18:46.032 "num_base_bdevs": 3, 00:18:46.032 "num_base_bdevs_discovered": 3, 00:18:46.032 "num_base_bdevs_operational": 3, 00:18:46.032 "base_bdevs_list": [ 00:18:46.032 { 00:18:46.032 "name": "pt1", 00:18:46.032 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:46.032 "is_configured": true, 00:18:46.032 "data_offset": 2048, 00:18:46.032 "data_size": 63488 00:18:46.032 }, 00:18:46.032 { 00:18:46.032 "name": "pt2", 00:18:46.032 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:46.032 "is_configured": true, 00:18:46.032 "data_offset": 2048, 00:18:46.032 "data_size": 63488 00:18:46.032 }, 00:18:46.032 { 00:18:46.032 "name": "pt3", 00:18:46.032 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:46.032 "is_configured": true, 00:18:46.032 "data_offset": 2048, 00:18:46.032 "data_size": 63488 00:18:46.032 } 00:18:46.032 ] 00:18:46.032 }' 00:18:46.032 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.032 17:12:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:46.599 17:12:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:46.857 [2024-07-23 17:12:42.128882] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:46.857 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:46.857 "name": "raid_bdev1", 00:18:46.857 "aliases": [ 00:18:46.857 "cb6233d5-a275-4777-beb7-2ce04ade60d8" 00:18:46.857 ], 00:18:46.857 "product_name": "Raid Volume", 00:18:46.857 "block_size": 512, 00:18:46.857 "num_blocks": 190464, 00:18:46.857 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:46.857 "assigned_rate_limits": { 00:18:46.857 "rw_ios_per_sec": 0, 00:18:46.857 "rw_mbytes_per_sec": 0, 00:18:46.857 "r_mbytes_per_sec": 0, 00:18:46.857 "w_mbytes_per_sec": 0 00:18:46.857 }, 00:18:46.857 "claimed": false, 00:18:46.857 "zoned": false, 00:18:46.857 "supported_io_types": { 00:18:46.857 "read": true, 00:18:46.857 "write": true, 00:18:46.857 "unmap": true, 00:18:46.857 "flush": true, 00:18:46.857 "reset": true, 00:18:46.857 "nvme_admin": false, 00:18:46.857 "nvme_io": false, 00:18:46.857 "nvme_io_md": false, 00:18:46.857 "write_zeroes": true, 00:18:46.857 "zcopy": false, 00:18:46.857 
"get_zone_info": false, 00:18:46.857 "zone_management": false, 00:18:46.857 "zone_append": false, 00:18:46.857 "compare": false, 00:18:46.857 "compare_and_write": false, 00:18:46.857 "abort": false, 00:18:46.857 "seek_hole": false, 00:18:46.857 "seek_data": false, 00:18:46.857 "copy": false, 00:18:46.857 "nvme_iov_md": false 00:18:46.857 }, 00:18:46.857 "memory_domains": [ 00:18:46.857 { 00:18:46.857 "dma_device_id": "system", 00:18:46.857 "dma_device_type": 1 00:18:46.857 }, 00:18:46.857 { 00:18:46.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.857 "dma_device_type": 2 00:18:46.857 }, 00:18:46.857 { 00:18:46.857 "dma_device_id": "system", 00:18:46.857 "dma_device_type": 1 00:18:46.857 }, 00:18:46.857 { 00:18:46.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.857 "dma_device_type": 2 00:18:46.857 }, 00:18:46.857 { 00:18:46.857 "dma_device_id": "system", 00:18:46.857 "dma_device_type": 1 00:18:46.857 }, 00:18:46.857 { 00:18:46.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.857 "dma_device_type": 2 00:18:46.857 } 00:18:46.857 ], 00:18:46.857 "driver_specific": { 00:18:46.857 "raid": { 00:18:46.857 "uuid": "cb6233d5-a275-4777-beb7-2ce04ade60d8", 00:18:46.857 "strip_size_kb": 64, 00:18:46.858 "state": "online", 00:18:46.858 "raid_level": "raid0", 00:18:46.858 "superblock": true, 00:18:46.858 "num_base_bdevs": 3, 00:18:46.858 "num_base_bdevs_discovered": 3, 00:18:46.858 "num_base_bdevs_operational": 3, 00:18:46.858 "base_bdevs_list": [ 00:18:46.858 { 00:18:46.858 "name": "pt1", 00:18:46.858 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:46.858 "is_configured": true, 00:18:46.858 "data_offset": 2048, 00:18:46.858 "data_size": 63488 00:18:46.858 }, 00:18:46.858 { 00:18:46.858 "name": "pt2", 00:18:46.858 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:46.858 "is_configured": true, 00:18:46.858 "data_offset": 2048, 00:18:46.858 "data_size": 63488 00:18:46.858 }, 00:18:46.858 { 00:18:46.858 "name": "pt3", 00:18:46.858 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:18:46.858 "is_configured": true, 00:18:46.858 "data_offset": 2048, 00:18:46.858 "data_size": 63488 00:18:46.858 } 00:18:46.858 ] 00:18:46.858 } 00:18:46.858 } 00:18:46.858 }' 00:18:46.858 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:46.858 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:46.858 pt2 00:18:46.858 pt3' 00:18:46.858 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.858 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:46.858 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.118 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.118 "name": "pt1", 00:18:47.118 "aliases": [ 00:18:47.118 "00000000-0000-0000-0000-000000000001" 00:18:47.118 ], 00:18:47.118 "product_name": "passthru", 00:18:47.118 "block_size": 512, 00:18:47.118 "num_blocks": 65536, 00:18:47.119 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:47.119 "assigned_rate_limits": { 00:18:47.119 "rw_ios_per_sec": 0, 00:18:47.119 "rw_mbytes_per_sec": 0, 00:18:47.119 "r_mbytes_per_sec": 0, 00:18:47.119 "w_mbytes_per_sec": 0 00:18:47.119 }, 00:18:47.119 "claimed": true, 00:18:47.119 "claim_type": "exclusive_write", 00:18:47.119 "zoned": false, 00:18:47.119 "supported_io_types": { 00:18:47.119 "read": true, 00:18:47.119 "write": true, 00:18:47.119 "unmap": true, 00:18:47.119 "flush": true, 00:18:47.119 "reset": true, 00:18:47.119 "nvme_admin": false, 00:18:47.119 "nvme_io": false, 00:18:47.119 "nvme_io_md": false, 00:18:47.119 "write_zeroes": true, 00:18:47.119 "zcopy": true, 00:18:47.119 "get_zone_info": false, 
00:18:47.119 "zone_management": false, 00:18:47.119 "zone_append": false, 00:18:47.119 "compare": false, 00:18:47.119 "compare_and_write": false, 00:18:47.119 "abort": true, 00:18:47.119 "seek_hole": false, 00:18:47.119 "seek_data": false, 00:18:47.119 "copy": true, 00:18:47.119 "nvme_iov_md": false 00:18:47.119 }, 00:18:47.119 "memory_domains": [ 00:18:47.119 { 00:18:47.119 "dma_device_id": "system", 00:18:47.119 "dma_device_type": 1 00:18:47.119 }, 00:18:47.119 { 00:18:47.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.119 "dma_device_type": 2 00:18:47.119 } 00:18:47.119 ], 00:18:47.119 "driver_specific": { 00:18:47.119 "passthru": { 00:18:47.119 "name": "pt1", 00:18:47.119 "base_bdev_name": "malloc1" 00:18:47.119 } 00:18:47.119 } 00:18:47.119 }' 00:18:47.119 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.119 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.377 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.636 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.636 17:12:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:47.636 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.636 17:12:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:47.636 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.636 "name": "pt2", 00:18:47.636 "aliases": [ 00:18:47.636 "00000000-0000-0000-0000-000000000002" 00:18:47.636 ], 00:18:47.636 "product_name": "passthru", 00:18:47.636 "block_size": 512, 00:18:47.636 "num_blocks": 65536, 00:18:47.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:47.636 "assigned_rate_limits": { 00:18:47.636 "rw_ios_per_sec": 0, 00:18:47.636 "rw_mbytes_per_sec": 0, 00:18:47.636 "r_mbytes_per_sec": 0, 00:18:47.636 "w_mbytes_per_sec": 0 00:18:47.636 }, 00:18:47.636 "claimed": true, 00:18:47.636 "claim_type": "exclusive_write", 00:18:47.636 "zoned": false, 00:18:47.636 "supported_io_types": { 00:18:47.636 "read": true, 00:18:47.636 "write": true, 00:18:47.636 "unmap": true, 00:18:47.636 "flush": true, 00:18:47.636 "reset": true, 00:18:47.636 "nvme_admin": false, 00:18:47.636 "nvme_io": false, 00:18:47.636 "nvme_io_md": false, 00:18:47.636 "write_zeroes": true, 00:18:47.636 "zcopy": true, 00:18:47.636 "get_zone_info": false, 00:18:47.636 "zone_management": false, 00:18:47.636 "zone_append": false, 00:18:47.636 "compare": false, 00:18:47.636 "compare_and_write": false, 00:18:47.636 "abort": true, 00:18:47.636 "seek_hole": false, 00:18:47.636 "seek_data": false, 00:18:47.636 "copy": true, 00:18:47.636 "nvme_iov_md": false 00:18:47.636 }, 00:18:47.636 "memory_domains": [ 00:18:47.636 { 00:18:47.636 "dma_device_id": "system", 00:18:47.636 "dma_device_type": 1 00:18:47.636 }, 00:18:47.636 { 00:18:47.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.636 
"dma_device_type": 2 00:18:47.636 } 00:18:47.636 ], 00:18:47.636 "driver_specific": { 00:18:47.636 "passthru": { 00:18:47.636 "name": "pt2", 00:18:47.636 "base_bdev_name": "malloc2" 00:18:47.636 } 00:18:47.636 } 00:18:47.636 }' 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.895 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:48.154 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.412 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.412 "name": "pt3", 00:18:48.412 "aliases": [ 00:18:48.412 
"00000000-0000-0000-0000-000000000003" 00:18:48.412 ], 00:18:48.412 "product_name": "passthru", 00:18:48.412 "block_size": 512, 00:18:48.412 "num_blocks": 65536, 00:18:48.412 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:48.412 "assigned_rate_limits": { 00:18:48.412 "rw_ios_per_sec": 0, 00:18:48.412 "rw_mbytes_per_sec": 0, 00:18:48.412 "r_mbytes_per_sec": 0, 00:18:48.412 "w_mbytes_per_sec": 0 00:18:48.412 }, 00:18:48.412 "claimed": true, 00:18:48.412 "claim_type": "exclusive_write", 00:18:48.412 "zoned": false, 00:18:48.412 "supported_io_types": { 00:18:48.412 "read": true, 00:18:48.412 "write": true, 00:18:48.412 "unmap": true, 00:18:48.412 "flush": true, 00:18:48.412 "reset": true, 00:18:48.412 "nvme_admin": false, 00:18:48.412 "nvme_io": false, 00:18:48.412 "nvme_io_md": false, 00:18:48.412 "write_zeroes": true, 00:18:48.412 "zcopy": true, 00:18:48.412 "get_zone_info": false, 00:18:48.412 "zone_management": false, 00:18:48.412 "zone_append": false, 00:18:48.412 "compare": false, 00:18:48.412 "compare_and_write": false, 00:18:48.412 "abort": true, 00:18:48.412 "seek_hole": false, 00:18:48.412 "seek_data": false, 00:18:48.412 "copy": true, 00:18:48.412 "nvme_iov_md": false 00:18:48.412 }, 00:18:48.412 "memory_domains": [ 00:18:48.412 { 00:18:48.412 "dma_device_id": "system", 00:18:48.412 "dma_device_type": 1 00:18:48.412 }, 00:18:48.412 { 00:18:48.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.412 "dma_device_type": 2 00:18:48.412 } 00:18:48.412 ], 00:18:48.412 "driver_specific": { 00:18:48.412 "passthru": { 00:18:48.412 "name": "pt3", 00:18:48.412 "base_bdev_name": "malloc3" 00:18:48.412 } 00:18:48.412 } 00:18:48.412 }' 00:18:48.412 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.412 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.412 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:48.412 17:12:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.412 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.671 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:48.671 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.671 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.671 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.671 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.671 17:12:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.671 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.671 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:48.671 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:48.930 [2024-07-23 17:12:44.190351] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' cb6233d5-a275-4777-beb7-2ce04ade60d8 '!=' cb6233d5-a275-4777-beb7-2ce04ade60d8 ']' 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4138519 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4138519 ']' 00:18:48.930 17:12:44 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4138519 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4138519 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4138519' 00:18:48.930 killing process with pid 4138519 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4138519 00:18:48.930 [2024-07-23 17:12:44.267611] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:48.930 [2024-07-23 17:12:44.267663] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:48.930 [2024-07-23 17:12:44.267715] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:48.930 [2024-07-23 17:12:44.267726] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1104b80 name raid_bdev1, state offline 00:18:48.930 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4138519 00:18:48.930 [2024-07-23 17:12:44.296351] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:49.257 17:12:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:49.257 00:18:49.257 real 0m14.887s 00:18:49.257 user 0m27.232s 00:18:49.257 sys 0m2.788s 00:18:49.257 17:12:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:49.257 17:12:44 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.257 ************************************ 00:18:49.257 END TEST raid_superblock_test 00:18:49.257 ************************************ 00:18:49.257 17:12:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:49.257 17:12:44 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:18:49.257 17:12:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:49.257 17:12:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:49.257 17:12:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:49.257 ************************************ 00:18:49.257 START TEST raid_read_error_test 00:18:49.257 ************************************ 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:49.257 17:12:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ffXmM685z0 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4140740 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4140740 /var/tmp/spdk-raid.sock 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4140740 ']' 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:49.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:49.257 17:12:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.257 [2024-07-23 17:12:44.665050] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:18:49.257 [2024-07-23 17:12:44.665118] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4140740 ] 00:18:49.516 [2024-07-23 17:12:44.798864] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:49.516 [2024-07-23 17:12:44.852974] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:49.516 [2024-07-23 17:12:44.914200] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:49.516 [2024-07-23 17:12:44.914239] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:50.454 17:12:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:50.454 17:12:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:50.454 17:12:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:50.454 17:12:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:50.454 BaseBdev1_malloc 00:18:50.454 17:12:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:50.713 true 00:18:50.713 17:12:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:50.972 [2024-07-23 17:12:46.316746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:50.972 [2024-07-23 17:12:46.316793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:18:50.972 [2024-07-23 17:12:46.316812] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fda5c0 00:18:50.972 [2024-07-23 17:12:46.316825] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.972 [2024-07-23 17:12:46.318486] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.972 [2024-07-23 17:12:46.318518] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:50.972 BaseBdev1 00:18:50.972 17:12:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:50.973 17:12:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:51.232 BaseBdev2_malloc 00:18:51.232 17:12:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:51.491 true 00:18:51.491 17:12:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:51.751 [2024-07-23 17:12:47.055330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:51.751 [2024-07-23 17:12:47.055374] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.751 [2024-07-23 17:12:47.055396] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fd4620 00:18:51.751 [2024-07-23 17:12:47.055415] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.751 [2024-07-23 17:12:47.057020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.751 [2024-07-23 17:12:47.057053] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:51.751 BaseBdev2 00:18:51.751 17:12:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:51.751 17:12:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:52.010 BaseBdev3_malloc 00:18:52.010 17:12:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:52.269 true 00:18:52.269 17:12:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:52.528 [2024-07-23 17:12:47.787018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:52.528 [2024-07-23 17:12:47.787063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.528 [2024-07-23 17:12:47.787085] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fd4c00 00:18:52.528 [2024-07-23 17:12:47.787098] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.528 [2024-07-23 17:12:47.788635] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.529 [2024-07-23 17:12:47.788662] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:52.529 BaseBdev3 00:18:52.529 17:12:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:52.788 [2024-07-23 17:12:48.031706] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:52.788 [2024-07-23 17:12:48.033116] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:52.788 [2024-07-23 17:12:48.033180] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:52.788 [2024-07-23 17:12:48.033376] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fd6670 00:18:52.788 [2024-07-23 17:12:48.033388] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:52.788 [2024-07-23 17:12:48.033579] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e267d0 00:18:52.788 [2024-07-23 17:12:48.033724] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fd6670 00:18:52.788 [2024-07-23 17:12:48.033734] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fd6670 00:18:52.788 [2024-07-23 17:12:48.033838] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.788 
17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.788 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.049 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.049 "name": "raid_bdev1", 00:18:53.049 "uuid": "cd551b32-22ee-43ee-ad3b-9aec6b4a169f", 00:18:53.049 "strip_size_kb": 64, 00:18:53.049 "state": "online", 00:18:53.049 "raid_level": "raid0", 00:18:53.049 "superblock": true, 00:18:53.049 "num_base_bdevs": 3, 00:18:53.049 "num_base_bdevs_discovered": 3, 00:18:53.049 "num_base_bdevs_operational": 3, 00:18:53.049 "base_bdevs_list": [ 00:18:53.049 { 00:18:53.049 "name": "BaseBdev1", 00:18:53.049 "uuid": "fbbb0ef2-1f04-5ff6-a477-1c7a8f9ce7de", 00:18:53.049 "is_configured": true, 00:18:53.049 "data_offset": 2048, 00:18:53.049 "data_size": 63488 00:18:53.049 }, 00:18:53.049 { 00:18:53.049 "name": "BaseBdev2", 00:18:53.049 "uuid": "55aa7008-d481-59b5-a61e-95841a4cf0c8", 00:18:53.049 "is_configured": true, 00:18:53.049 "data_offset": 2048, 00:18:53.049 "data_size": 63488 00:18:53.049 }, 00:18:53.049 { 00:18:53.049 "name": "BaseBdev3", 00:18:53.049 "uuid": "69152252-ef8c-5eef-949c-7db715396dd8", 00:18:53.049 "is_configured": true, 00:18:53.049 "data_offset": 2048, 00:18:53.049 "data_size": 63488 00:18:53.049 } 00:18:53.049 ] 00:18:53.049 }' 00:18:53.049 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.049 17:12:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.616 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:18:53.616 17:12:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:53.616 [2024-07-23 17:12:48.942352] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fd7f40 00:18:54.554 17:12:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.814 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.073 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.073 "name": "raid_bdev1", 00:18:55.073 "uuid": "cd551b32-22ee-43ee-ad3b-9aec6b4a169f", 00:18:55.073 "strip_size_kb": 64, 00:18:55.073 "state": "online", 00:18:55.073 "raid_level": "raid0", 00:18:55.073 "superblock": true, 00:18:55.073 "num_base_bdevs": 3, 00:18:55.073 "num_base_bdevs_discovered": 3, 00:18:55.073 "num_base_bdevs_operational": 3, 00:18:55.073 "base_bdevs_list": [ 00:18:55.073 { 00:18:55.073 "name": "BaseBdev1", 00:18:55.073 "uuid": "fbbb0ef2-1f04-5ff6-a477-1c7a8f9ce7de", 00:18:55.073 "is_configured": true, 00:18:55.073 "data_offset": 2048, 00:18:55.073 "data_size": 63488 00:18:55.073 }, 00:18:55.073 { 00:18:55.073 "name": "BaseBdev2", 00:18:55.073 "uuid": "55aa7008-d481-59b5-a61e-95841a4cf0c8", 00:18:55.073 "is_configured": true, 00:18:55.073 "data_offset": 2048, 00:18:55.073 "data_size": 63488 00:18:55.073 }, 00:18:55.073 { 00:18:55.073 "name": "BaseBdev3", 00:18:55.073 "uuid": "69152252-ef8c-5eef-949c-7db715396dd8", 00:18:55.073 "is_configured": true, 00:18:55.073 "data_offset": 2048, 00:18:55.073 "data_size": 63488 00:18:55.073 } 00:18:55.073 ] 00:18:55.073 }' 00:18:55.073 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.073 17:12:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.642 17:12:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:55.902 [2024-07-23 17:12:51.198805] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:55.902 [2024-07-23 17:12:51.198843] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:55.902 [2024-07-23 17:12:51.202008] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:55.902 [2024-07-23 17:12:51.202043] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:55.902 [2024-07-23 17:12:51.202075] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:55.902 [2024-07-23 17:12:51.202085] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fd6670 name raid_bdev1, state offline 00:18:55.902 0 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4140740 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4140740 ']' 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4140740 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4140740 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4140740' 00:18:55.902 killing process with pid 4140740 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4140740 00:18:55.902 [2024-07-23 17:12:51.283820] bdev_raid.c:1373:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:18:55.902 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4140740 00:18:55.902 [2024-07-23 17:12:51.305231] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ffXmM685z0 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:18:56.162 00:18:56.162 real 0m6.936s 00:18:56.162 user 0m10.934s 00:18:56.162 sys 0m1.262s 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:56.162 17:12:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.162 ************************************ 00:18:56.162 END TEST raid_read_error_test 00:18:56.162 ************************************ 00:18:56.162 17:12:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:56.162 17:12:51 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:18:56.162 17:12:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:56.162 17:12:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:56.162 17:12:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:56.422 ************************************ 
00:18:56.422 START TEST raid_write_error_test 00:18:56.422 ************************************ 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vviGDtUjmp 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4141729 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4141729 /var/tmp/spdk-raid.sock 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4141729 ']' 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:18:56.422 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:56.422 17:12:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.422 [2024-07-23 17:12:51.696212] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:18:56.422 [2024-07-23 17:12:51.696286] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4141729 ] 00:18:56.422 [2024-07-23 17:12:51.832263] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:56.682 [2024-07-23 17:12:51.887789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:56.682 [2024-07-23 17:12:51.944472] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:56.682 [2024-07-23 17:12:51.944505] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.250 17:12:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:57.250 17:12:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:57.250 17:12:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:57.250 17:12:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:57.508 BaseBdev1_malloc 00:18:57.509 17:12:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:57.768 true 00:18:57.768 17:12:53 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:58.027 [2024-07-23 17:12:53.297468] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:58.027 [2024-07-23 17:12:53.297513] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.027 [2024-07-23 17:12:53.297532] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217a5c0 00:18:58.027 [2024-07-23 17:12:53.297545] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.027 [2024-07-23 17:12:53.299028] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.027 [2024-07-23 17:12:53.299053] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:58.027 BaseBdev1 00:18:58.027 17:12:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:58.027 17:12:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:58.285 BaseBdev2_malloc 00:18:58.285 17:12:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:58.543 true 00:18:58.543 17:12:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:58.802 [2024-07-23 17:12:53.979802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:58.802 [2024-07-23 17:12:53.979845] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.802 [2024-07-23 17:12:53.979868] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2174620 00:18:58.802 [2024-07-23 17:12:53.979881] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.802 [2024-07-23 17:12:53.981290] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.802 [2024-07-23 17:12:53.981316] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:58.802 BaseBdev2 00:18:58.802 17:12:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:58.802 17:12:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:59.061 BaseBdev3_malloc 00:18:59.061 17:12:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:59.320 true 00:18:59.320 17:12:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:59.320 [2024-07-23 17:12:54.730294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:59.320 [2024-07-23 17:12:54.730334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.320 [2024-07-23 17:12:54.730354] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2174c00 00:18:59.320 [2024-07-23 17:12:54.730367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.320 [2024-07-23 17:12:54.731762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:18:59.320 [2024-07-23 17:12:54.731789] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:59.320 BaseBdev3 00:18:59.579 17:12:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:59.579 [2024-07-23 17:12:54.978991] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:59.579 [2024-07-23 17:12:54.980334] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:59.579 [2024-07-23 17:12:54.980397] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:59.579 [2024-07-23 17:12:54.980592] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2176670 00:18:59.579 [2024-07-23 17:12:54.980604] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:59.579 [2024-07-23 17:12:54.980795] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fc67d0 00:18:59.579 [2024-07-23 17:12:54.980951] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2176670 00:18:59.579 [2024-07-23 17:12:54.980962] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2176670 00:18:59.579 [2024-07-23 17:12:54.981058] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.838 "name": "raid_bdev1", 00:18:59.838 "uuid": "392bb6a6-e63e-4a88-98d2-3824dee9ed90", 00:18:59.838 "strip_size_kb": 64, 00:18:59.838 "state": "online", 00:18:59.838 "raid_level": "raid0", 00:18:59.838 "superblock": true, 00:18:59.838 "num_base_bdevs": 3, 00:18:59.838 "num_base_bdevs_discovered": 3, 00:18:59.838 "num_base_bdevs_operational": 3, 00:18:59.838 "base_bdevs_list": [ 00:18:59.838 { 00:18:59.838 "name": "BaseBdev1", 00:18:59.838 "uuid": "6aef1e1a-98ef-5997-8e71-1659f2b156f4", 00:18:59.838 "is_configured": true, 00:18:59.838 "data_offset": 2048, 00:18:59.838 "data_size": 63488 00:18:59.838 }, 00:18:59.838 { 00:18:59.838 "name": "BaseBdev2", 00:18:59.838 "uuid": "3c390e66-0984-51c1-b4d5-0ae88e7e61b6", 00:18:59.838 "is_configured": true, 00:18:59.838 "data_offset": 2048, 00:18:59.838 "data_size": 63488 00:18:59.838 }, 00:18:59.838 { 00:18:59.838 "name": "BaseBdev3", 
00:18:59.838 "uuid": "32520c98-a7ad-531e-8c24-c34ea3b4d1f4", 00:18:59.838 "is_configured": true, 00:18:59.838 "data_offset": 2048, 00:18:59.838 "data_size": 63488 00:18:59.838 } 00:18:59.838 ] 00:18:59.838 }' 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.838 17:12:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:00.775 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:00.775 17:12:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:00.775 [2024-07-23 17:12:55.949815] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2177f40 00:19:01.712 17:12:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:01.712 17:12:57 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.712 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:01.971 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.971 "name": "raid_bdev1", 00:19:01.971 "uuid": "392bb6a6-e63e-4a88-98d2-3824dee9ed90", 00:19:01.971 "strip_size_kb": 64, 00:19:01.971 "state": "online", 00:19:01.971 "raid_level": "raid0", 00:19:01.971 "superblock": true, 00:19:01.971 "num_base_bdevs": 3, 00:19:01.971 "num_base_bdevs_discovered": 3, 00:19:01.971 "num_base_bdevs_operational": 3, 00:19:01.971 "base_bdevs_list": [ 00:19:01.971 { 00:19:01.971 "name": "BaseBdev1", 00:19:01.971 "uuid": "6aef1e1a-98ef-5997-8e71-1659f2b156f4", 00:19:01.971 "is_configured": true, 00:19:01.971 "data_offset": 2048, 00:19:01.971 "data_size": 63488 00:19:01.971 }, 00:19:01.971 { 00:19:01.971 "name": "BaseBdev2", 00:19:01.971 "uuid": "3c390e66-0984-51c1-b4d5-0ae88e7e61b6", 00:19:01.971 "is_configured": true, 00:19:01.971 "data_offset": 2048, 00:19:01.971 "data_size": 63488 00:19:01.971 }, 00:19:01.971 { 00:19:01.971 "name": "BaseBdev3", 00:19:01.971 "uuid": "32520c98-a7ad-531e-8c24-c34ea3b4d1f4", 00:19:01.971 "is_configured": true, 00:19:01.971 "data_offset": 2048, 00:19:01.971 "data_size": 
63488 00:19:01.971 } 00:19:01.971 ] 00:19:01.971 }' 00:19:01.971 17:12:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.971 17:12:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:02.907 [2024-07-23 17:12:58.243956] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:02.907 [2024-07-23 17:12:58.243998] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:02.907 [2024-07-23 17:12:58.247196] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:02.907 [2024-07-23 17:12:58.247230] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:02.907 [2024-07-23 17:12:58.247263] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:02.907 [2024-07-23 17:12:58.247273] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2176670 name raid_bdev1, state offline 00:19:02.907 0 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4141729 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4141729 ']' 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4141729 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4141729 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4141729' 00:19:02.907 killing process with pid 4141729 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4141729 00:19:02.907 [2024-07-23 17:12:58.311957] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:02.907 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4141729 00:19:03.166 [2024-07-23 17:12:58.336245] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vviGDtUjmp 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:19:03.166 00:19:03.166 real 0m6.943s 00:19:03.166 user 0m11.002s 00:19:03.166 sys 0m1.258s 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:03.166 17:12:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.166 ************************************ 00:19:03.166 END TEST raid_write_error_test 00:19:03.166 
************************************ 00:19:03.425 17:12:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:03.425 17:12:58 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:03.425 17:12:58 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:19:03.425 17:12:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:03.425 17:12:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:03.425 17:12:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:03.425 ************************************ 00:19:03.425 START TEST raid_state_function_test 00:19:03.425 ************************************ 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:03.425 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4142699 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4142699' 00:19:03.426 Process raid pid: 4142699 00:19:03.426 17:12:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4142699 /var/tmp/spdk-raid.sock 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4142699 ']' 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:03.426 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:03.426 17:12:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.426 [2024-07-23 17:12:58.714324] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:19:03.426 [2024-07-23 17:12:58.714393] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:03.426 [2024-07-23 17:12:58.846015] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:03.689 [2024-07-23 17:12:58.897419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:03.689 [2024-07-23 17:12:58.954819] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:03.689 [2024-07-23 17:12:58.954844] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:03.993 17:12:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:03.993 17:12:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:03.993 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:04.253 [2024-07-23 17:12:59.404687] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:04.253 [2024-07-23 17:12:59.404726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:04.253 [2024-07-23 17:12:59.404737] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:04.253 [2024-07-23 17:12:59.404748] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:04.253 [2024-07-23 17:12:59.404757] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:04.253 [2024-07-23 17:12:59.404769] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:04.253 
17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.253 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.511 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.511 "name": "Existed_Raid", 00:19:04.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.511 "strip_size_kb": 64, 00:19:04.511 "state": "configuring", 00:19:04.511 "raid_level": "concat", 00:19:04.511 "superblock": false, 00:19:04.511 "num_base_bdevs": 3, 00:19:04.511 "num_base_bdevs_discovered": 0, 00:19:04.511 "num_base_bdevs_operational": 3, 00:19:04.511 "base_bdevs_list": [ 00:19:04.511 { 
00:19:04.511 "name": "BaseBdev1", 00:19:04.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.512 "is_configured": false, 00:19:04.512 "data_offset": 0, 00:19:04.512 "data_size": 0 00:19:04.512 }, 00:19:04.512 { 00:19:04.512 "name": "BaseBdev2", 00:19:04.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.512 "is_configured": false, 00:19:04.512 "data_offset": 0, 00:19:04.512 "data_size": 0 00:19:04.512 }, 00:19:04.512 { 00:19:04.512 "name": "BaseBdev3", 00:19:04.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.512 "is_configured": false, 00:19:04.512 "data_offset": 0, 00:19:04.512 "data_size": 0 00:19:04.512 } 00:19:04.512 ] 00:19:04.512 }' 00:19:04.512 17:12:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.512 17:12:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.079 17:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:05.338 [2024-07-23 17:13:00.511501] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:05.338 [2024-07-23 17:13:00.511535] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa9280 name Existed_Raid, state configuring 00:19:05.338 17:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:05.597 [2024-07-23 17:13:00.760180] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:05.597 [2024-07-23 17:13:00.760212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:05.597 [2024-07-23 17:13:00.760221] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:19:05.597 [2024-07-23 17:13:00.760233] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:05.597 [2024-07-23 17:13:00.760242] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:05.597 [2024-07-23 17:13:00.760253] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:05.597 17:13:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:05.856 [2024-07-23 17:13:01.022563] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:05.856 BaseBdev1 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:05.856 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:06.115 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:06.115 [ 00:19:06.115 { 00:19:06.115 "name": "BaseBdev1", 00:19:06.115 "aliases": [ 00:19:06.115 
"ad70735b-5ff5-4102-af0d-f0062c822b8e" 00:19:06.115 ], 00:19:06.115 "product_name": "Malloc disk", 00:19:06.115 "block_size": 512, 00:19:06.115 "num_blocks": 65536, 00:19:06.115 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:06.115 "assigned_rate_limits": { 00:19:06.115 "rw_ios_per_sec": 0, 00:19:06.115 "rw_mbytes_per_sec": 0, 00:19:06.115 "r_mbytes_per_sec": 0, 00:19:06.115 "w_mbytes_per_sec": 0 00:19:06.115 }, 00:19:06.115 "claimed": true, 00:19:06.115 "claim_type": "exclusive_write", 00:19:06.115 "zoned": false, 00:19:06.115 "supported_io_types": { 00:19:06.115 "read": true, 00:19:06.115 "write": true, 00:19:06.115 "unmap": true, 00:19:06.115 "flush": true, 00:19:06.115 "reset": true, 00:19:06.115 "nvme_admin": false, 00:19:06.115 "nvme_io": false, 00:19:06.115 "nvme_io_md": false, 00:19:06.115 "write_zeroes": true, 00:19:06.115 "zcopy": true, 00:19:06.115 "get_zone_info": false, 00:19:06.115 "zone_management": false, 00:19:06.115 "zone_append": false, 00:19:06.115 "compare": false, 00:19:06.115 "compare_and_write": false, 00:19:06.115 "abort": true, 00:19:06.115 "seek_hole": false, 00:19:06.115 "seek_data": false, 00:19:06.115 "copy": true, 00:19:06.115 "nvme_iov_md": false 00:19:06.115 }, 00:19:06.116 "memory_domains": [ 00:19:06.116 { 00:19:06.116 "dma_device_id": "system", 00:19:06.116 "dma_device_type": 1 00:19:06.116 }, 00:19:06.116 { 00:19:06.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.116 "dma_device_type": 2 00:19:06.116 } 00:19:06.116 ], 00:19:06.116 "driver_specific": {} 00:19:06.116 } 00:19:06.116 ] 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:06.116 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.375 "name": "Existed_Raid", 00:19:06.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.375 "strip_size_kb": 64, 00:19:06.375 "state": "configuring", 00:19:06.375 "raid_level": "concat", 00:19:06.375 "superblock": false, 00:19:06.375 "num_base_bdevs": 3, 00:19:06.375 "num_base_bdevs_discovered": 1, 00:19:06.375 "num_base_bdevs_operational": 3, 00:19:06.375 "base_bdevs_list": [ 00:19:06.375 { 00:19:06.375 "name": "BaseBdev1", 00:19:06.375 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:06.375 "is_configured": true, 00:19:06.375 "data_offset": 0, 00:19:06.375 "data_size": 65536 00:19:06.375 }, 00:19:06.375 { 00:19:06.375 "name": "BaseBdev2", 00:19:06.375 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:06.375 "is_configured": false, 00:19:06.375 "data_offset": 0, 00:19:06.375 "data_size": 0 00:19:06.375 }, 00:19:06.375 { 00:19:06.375 "name": "BaseBdev3", 00:19:06.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.375 "is_configured": false, 00:19:06.375 "data_offset": 0, 00:19:06.375 "data_size": 0 00:19:06.375 } 00:19:06.375 ] 00:19:06.375 }' 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.375 17:13:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.313 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:07.313 [2024-07-23 17:13:02.618787] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:07.313 [2024-07-23 17:13:02.618824] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa8bb0 name Existed_Raid, state configuring 00:19:07.313 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:07.572 [2024-07-23 17:13:02.867477] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:07.572 [2024-07-23 17:13:02.869010] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:07.572 [2024-07-23 17:13:02.869043] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:07.572 [2024-07-23 17:13:02.869054] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:07.572 [2024-07-23 17:13:02.869065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.572 17:13:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.832 17:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.832 "name": "Existed_Raid", 00:19:07.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.832 "strip_size_kb": 64, 00:19:07.832 "state": "configuring", 00:19:07.832 
"raid_level": "concat", 00:19:07.832 "superblock": false, 00:19:07.832 "num_base_bdevs": 3, 00:19:07.832 "num_base_bdevs_discovered": 1, 00:19:07.832 "num_base_bdevs_operational": 3, 00:19:07.832 "base_bdevs_list": [ 00:19:07.832 { 00:19:07.832 "name": "BaseBdev1", 00:19:07.832 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:07.832 "is_configured": true, 00:19:07.832 "data_offset": 0, 00:19:07.832 "data_size": 65536 00:19:07.832 }, 00:19:07.832 { 00:19:07.832 "name": "BaseBdev2", 00:19:07.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.832 "is_configured": false, 00:19:07.832 "data_offset": 0, 00:19:07.832 "data_size": 0 00:19:07.832 }, 00:19:07.832 { 00:19:07.832 "name": "BaseBdev3", 00:19:07.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.832 "is_configured": false, 00:19:07.832 "data_offset": 0, 00:19:07.832 "data_size": 0 00:19:07.832 } 00:19:07.832 ] 00:19:07.832 }' 00:19:07.832 17:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.832 17:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.405 17:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:08.663 [2024-07-23 17:13:03.949789] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:08.663 BaseBdev2 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:08.663 17:13:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:08.922 17:13:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:09.181 [ 00:19:09.181 { 00:19:09.181 "name": "BaseBdev2", 00:19:09.181 "aliases": [ 00:19:09.181 "42601f3d-39d9-41de-a83d-5a9a66d4756d" 00:19:09.181 ], 00:19:09.181 "product_name": "Malloc disk", 00:19:09.181 "block_size": 512, 00:19:09.181 "num_blocks": 65536, 00:19:09.181 "uuid": "42601f3d-39d9-41de-a83d-5a9a66d4756d", 00:19:09.181 "assigned_rate_limits": { 00:19:09.181 "rw_ios_per_sec": 0, 00:19:09.181 "rw_mbytes_per_sec": 0, 00:19:09.181 "r_mbytes_per_sec": 0, 00:19:09.181 "w_mbytes_per_sec": 0 00:19:09.181 }, 00:19:09.181 "claimed": true, 00:19:09.181 "claim_type": "exclusive_write", 00:19:09.181 "zoned": false, 00:19:09.181 "supported_io_types": { 00:19:09.181 "read": true, 00:19:09.181 "write": true, 00:19:09.181 "unmap": true, 00:19:09.181 "flush": true, 00:19:09.181 "reset": true, 00:19:09.181 "nvme_admin": false, 00:19:09.181 "nvme_io": false, 00:19:09.181 "nvme_io_md": false, 00:19:09.181 "write_zeroes": true, 00:19:09.181 "zcopy": true, 00:19:09.181 "get_zone_info": false, 00:19:09.181 "zone_management": false, 00:19:09.181 "zone_append": false, 00:19:09.181 "compare": false, 00:19:09.181 "compare_and_write": false, 00:19:09.181 "abort": true, 00:19:09.181 "seek_hole": false, 00:19:09.181 "seek_data": false, 00:19:09.181 "copy": true, 00:19:09.181 "nvme_iov_md": false 00:19:09.181 }, 00:19:09.181 "memory_domains": [ 00:19:09.181 { 00:19:09.181 "dma_device_id": "system", 
00:19:09.181 "dma_device_type": 1 00:19:09.181 }, 00:19:09.181 { 00:19:09.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.182 "dma_device_type": 2 00:19:09.182 } 00:19:09.182 ], 00:19:09.182 "driver_specific": {} 00:19:09.182 } 00:19:09.182 ] 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.182 17:13:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:09.441 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.441 "name": "Existed_Raid", 00:19:09.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.441 "strip_size_kb": 64, 00:19:09.441 "state": "configuring", 00:19:09.441 "raid_level": "concat", 00:19:09.441 "superblock": false, 00:19:09.441 "num_base_bdevs": 3, 00:19:09.441 "num_base_bdevs_discovered": 2, 00:19:09.441 "num_base_bdevs_operational": 3, 00:19:09.441 "base_bdevs_list": [ 00:19:09.441 { 00:19:09.441 "name": "BaseBdev1", 00:19:09.441 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:09.441 "is_configured": true, 00:19:09.441 "data_offset": 0, 00:19:09.441 "data_size": 65536 00:19:09.441 }, 00:19:09.441 { 00:19:09.441 "name": "BaseBdev2", 00:19:09.441 "uuid": "42601f3d-39d9-41de-a83d-5a9a66d4756d", 00:19:09.441 "is_configured": true, 00:19:09.441 "data_offset": 0, 00:19:09.441 "data_size": 65536 00:19:09.441 }, 00:19:09.441 { 00:19:09.441 "name": "BaseBdev3", 00:19:09.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.441 "is_configured": false, 00:19:09.441 "data_offset": 0, 00:19:09.441 "data_size": 0 00:19:09.441 } 00:19:09.441 ] 00:19:09.441 }' 00:19:09.441 17:13:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.441 17:13:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.009 17:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:10.268 [2024-07-23 17:13:05.529371] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:10.268 [2024-07-23 17:13:05.529407] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fa8800 00:19:10.268 [2024-07-23 17:13:05.529416] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:19:10.268 [2024-07-23 17:13:05.529662] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1facb50 00:19:10.268 [2024-07-23 17:13:05.529778] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fa8800 00:19:10.268 [2024-07-23 17:13:05.529788] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fa8800 00:19:10.268 [2024-07-23 17:13:05.529963] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:10.268 BaseBdev3 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:10.268 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:10.527 17:13:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:10.787 [ 00:19:10.787 { 00:19:10.787 "name": "BaseBdev3", 00:19:10.787 "aliases": [ 00:19:10.787 "24241ae4-a439-4f44-ba2f-a7fe786c216a" 00:19:10.787 ], 00:19:10.787 "product_name": "Malloc disk", 00:19:10.787 "block_size": 512, 00:19:10.787 "num_blocks": 65536, 00:19:10.787 
"uuid": "24241ae4-a439-4f44-ba2f-a7fe786c216a", 00:19:10.787 "assigned_rate_limits": { 00:19:10.787 "rw_ios_per_sec": 0, 00:19:10.787 "rw_mbytes_per_sec": 0, 00:19:10.787 "r_mbytes_per_sec": 0, 00:19:10.787 "w_mbytes_per_sec": 0 00:19:10.787 }, 00:19:10.787 "claimed": true, 00:19:10.787 "claim_type": "exclusive_write", 00:19:10.787 "zoned": false, 00:19:10.787 "supported_io_types": { 00:19:10.787 "read": true, 00:19:10.787 "write": true, 00:19:10.787 "unmap": true, 00:19:10.787 "flush": true, 00:19:10.787 "reset": true, 00:19:10.787 "nvme_admin": false, 00:19:10.787 "nvme_io": false, 00:19:10.787 "nvme_io_md": false, 00:19:10.787 "write_zeroes": true, 00:19:10.787 "zcopy": true, 00:19:10.787 "get_zone_info": false, 00:19:10.787 "zone_management": false, 00:19:10.787 "zone_append": false, 00:19:10.787 "compare": false, 00:19:10.787 "compare_and_write": false, 00:19:10.787 "abort": true, 00:19:10.787 "seek_hole": false, 00:19:10.787 "seek_data": false, 00:19:10.787 "copy": true, 00:19:10.787 "nvme_iov_md": false 00:19:10.787 }, 00:19:10.787 "memory_domains": [ 00:19:10.787 { 00:19:10.787 "dma_device_id": "system", 00:19:10.787 "dma_device_type": 1 00:19:10.787 }, 00:19:10.787 { 00:19:10.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.787 "dma_device_type": 2 00:19:10.787 } 00:19:10.787 ], 00:19:10.787 "driver_specific": {} 00:19:10.787 } 00:19:10.787 ] 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:10.787 17:13:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.787 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.046 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.046 "name": "Existed_Raid", 00:19:11.046 "uuid": "53fb8337-22ff-4b95-a316-a9d16be14a2a", 00:19:11.046 "strip_size_kb": 64, 00:19:11.046 "state": "online", 00:19:11.046 "raid_level": "concat", 00:19:11.046 "superblock": false, 00:19:11.046 "num_base_bdevs": 3, 00:19:11.046 "num_base_bdevs_discovered": 3, 00:19:11.046 "num_base_bdevs_operational": 3, 00:19:11.046 "base_bdevs_list": [ 00:19:11.046 { 00:19:11.046 "name": "BaseBdev1", 00:19:11.046 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:11.046 "is_configured": true, 00:19:11.046 "data_offset": 0, 00:19:11.046 "data_size": 65536 00:19:11.046 }, 00:19:11.046 { 00:19:11.046 "name": "BaseBdev2", 00:19:11.046 "uuid": 
"42601f3d-39d9-41de-a83d-5a9a66d4756d", 00:19:11.046 "is_configured": true, 00:19:11.046 "data_offset": 0, 00:19:11.046 "data_size": 65536 00:19:11.046 }, 00:19:11.046 { 00:19:11.046 "name": "BaseBdev3", 00:19:11.046 "uuid": "24241ae4-a439-4f44-ba2f-a7fe786c216a", 00:19:11.046 "is_configured": true, 00:19:11.046 "data_offset": 0, 00:19:11.046 "data_size": 65536 00:19:11.046 } 00:19:11.046 ] 00:19:11.046 }' 00:19:11.046 17:13:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.046 17:13:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:11.983 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:12.243 [2024-07-23 17:13:07.414667] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:12.243 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:12.243 "name": "Existed_Raid", 00:19:12.243 "aliases": [ 00:19:12.243 "53fb8337-22ff-4b95-a316-a9d16be14a2a" 00:19:12.243 ], 00:19:12.243 "product_name": "Raid Volume", 
00:19:12.243 "block_size": 512, 00:19:12.243 "num_blocks": 196608, 00:19:12.243 "uuid": "53fb8337-22ff-4b95-a316-a9d16be14a2a", 00:19:12.243 "assigned_rate_limits": { 00:19:12.243 "rw_ios_per_sec": 0, 00:19:12.243 "rw_mbytes_per_sec": 0, 00:19:12.243 "r_mbytes_per_sec": 0, 00:19:12.243 "w_mbytes_per_sec": 0 00:19:12.243 }, 00:19:12.243 "claimed": false, 00:19:12.243 "zoned": false, 00:19:12.243 "supported_io_types": { 00:19:12.243 "read": true, 00:19:12.243 "write": true, 00:19:12.243 "unmap": true, 00:19:12.243 "flush": true, 00:19:12.243 "reset": true, 00:19:12.243 "nvme_admin": false, 00:19:12.243 "nvme_io": false, 00:19:12.243 "nvme_io_md": false, 00:19:12.243 "write_zeroes": true, 00:19:12.243 "zcopy": false, 00:19:12.243 "get_zone_info": false, 00:19:12.243 "zone_management": false, 00:19:12.243 "zone_append": false, 00:19:12.243 "compare": false, 00:19:12.243 "compare_and_write": false, 00:19:12.243 "abort": false, 00:19:12.243 "seek_hole": false, 00:19:12.243 "seek_data": false, 00:19:12.243 "copy": false, 00:19:12.243 "nvme_iov_md": false 00:19:12.243 }, 00:19:12.243 "memory_domains": [ 00:19:12.243 { 00:19:12.243 "dma_device_id": "system", 00:19:12.243 "dma_device_type": 1 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.243 "dma_device_type": 2 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "dma_device_id": "system", 00:19:12.243 "dma_device_type": 1 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.243 "dma_device_type": 2 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "dma_device_id": "system", 00:19:12.243 "dma_device_type": 1 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.243 "dma_device_type": 2 00:19:12.243 } 00:19:12.243 ], 00:19:12.243 "driver_specific": { 00:19:12.243 "raid": { 00:19:12.243 "uuid": "53fb8337-22ff-4b95-a316-a9d16be14a2a", 00:19:12.243 "strip_size_kb": 64, 00:19:12.243 "state": "online", 00:19:12.243 
"raid_level": "concat", 00:19:12.243 "superblock": false, 00:19:12.243 "num_base_bdevs": 3, 00:19:12.243 "num_base_bdevs_discovered": 3, 00:19:12.243 "num_base_bdevs_operational": 3, 00:19:12.243 "base_bdevs_list": [ 00:19:12.243 { 00:19:12.243 "name": "BaseBdev1", 00:19:12.243 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:12.243 "is_configured": true, 00:19:12.243 "data_offset": 0, 00:19:12.243 "data_size": 65536 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "name": "BaseBdev2", 00:19:12.243 "uuid": "42601f3d-39d9-41de-a83d-5a9a66d4756d", 00:19:12.243 "is_configured": true, 00:19:12.243 "data_offset": 0, 00:19:12.243 "data_size": 65536 00:19:12.243 }, 00:19:12.243 { 00:19:12.243 "name": "BaseBdev3", 00:19:12.243 "uuid": "24241ae4-a439-4f44-ba2f-a7fe786c216a", 00:19:12.243 "is_configured": true, 00:19:12.243 "data_offset": 0, 00:19:12.243 "data_size": 65536 00:19:12.243 } 00:19:12.243 ] 00:19:12.243 } 00:19:12.243 } 00:19:12.243 }' 00:19:12.243 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:12.243 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:12.243 BaseBdev2 00:19:12.243 BaseBdev3' 00:19:12.243 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.243 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:12.243 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.503 "name": "BaseBdev1", 00:19:12.503 "aliases": [ 00:19:12.503 "ad70735b-5ff5-4102-af0d-f0062c822b8e" 00:19:12.503 ], 00:19:12.503 "product_name": "Malloc disk", 00:19:12.503 
"block_size": 512, 00:19:12.503 "num_blocks": 65536, 00:19:12.503 "uuid": "ad70735b-5ff5-4102-af0d-f0062c822b8e", 00:19:12.503 "assigned_rate_limits": { 00:19:12.503 "rw_ios_per_sec": 0, 00:19:12.503 "rw_mbytes_per_sec": 0, 00:19:12.503 "r_mbytes_per_sec": 0, 00:19:12.503 "w_mbytes_per_sec": 0 00:19:12.503 }, 00:19:12.503 "claimed": true, 00:19:12.503 "claim_type": "exclusive_write", 00:19:12.503 "zoned": false, 00:19:12.503 "supported_io_types": { 00:19:12.503 "read": true, 00:19:12.503 "write": true, 00:19:12.503 "unmap": true, 00:19:12.503 "flush": true, 00:19:12.503 "reset": true, 00:19:12.503 "nvme_admin": false, 00:19:12.503 "nvme_io": false, 00:19:12.503 "nvme_io_md": false, 00:19:12.503 "write_zeroes": true, 00:19:12.503 "zcopy": true, 00:19:12.503 "get_zone_info": false, 00:19:12.503 "zone_management": false, 00:19:12.503 "zone_append": false, 00:19:12.503 "compare": false, 00:19:12.503 "compare_and_write": false, 00:19:12.503 "abort": true, 00:19:12.503 "seek_hole": false, 00:19:12.503 "seek_data": false, 00:19:12.503 "copy": true, 00:19:12.503 "nvme_iov_md": false 00:19:12.503 }, 00:19:12.503 "memory_domains": [ 00:19:12.503 { 00:19:12.503 "dma_device_id": "system", 00:19:12.503 "dma_device_type": 1 00:19:12.503 }, 00:19:12.503 { 00:19:12.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.503 "dma_device_type": 2 00:19:12.503 } 00:19:12.503 ], 00:19:12.503 "driver_specific": {} 00:19:12.503 }' 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.503 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.762 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.762 17:13:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.762 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.762 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.762 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.762 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:12.762 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.021 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.021 "name": "BaseBdev2", 00:19:13.021 "aliases": [ 00:19:13.021 "42601f3d-39d9-41de-a83d-5a9a66d4756d" 00:19:13.021 ], 00:19:13.021 "product_name": "Malloc disk", 00:19:13.021 "block_size": 512, 00:19:13.021 "num_blocks": 65536, 00:19:13.021 "uuid": "42601f3d-39d9-41de-a83d-5a9a66d4756d", 00:19:13.021 "assigned_rate_limits": { 00:19:13.021 "rw_ios_per_sec": 0, 00:19:13.021 "rw_mbytes_per_sec": 0, 00:19:13.021 "r_mbytes_per_sec": 0, 00:19:13.021 "w_mbytes_per_sec": 0 00:19:13.021 }, 00:19:13.021 "claimed": true, 00:19:13.021 "claim_type": "exclusive_write", 00:19:13.021 "zoned": false, 00:19:13.021 "supported_io_types": { 00:19:13.021 "read": true, 00:19:13.021 "write": true, 00:19:13.021 "unmap": true, 00:19:13.021 "flush": true, 00:19:13.021 "reset": true, 00:19:13.021 "nvme_admin": 
false, 00:19:13.021 "nvme_io": false, 00:19:13.021 "nvme_io_md": false, 00:19:13.021 "write_zeroes": true, 00:19:13.021 "zcopy": true, 00:19:13.021 "get_zone_info": false, 00:19:13.021 "zone_management": false, 00:19:13.021 "zone_append": false, 00:19:13.021 "compare": false, 00:19:13.021 "compare_and_write": false, 00:19:13.021 "abort": true, 00:19:13.021 "seek_hole": false, 00:19:13.021 "seek_data": false, 00:19:13.021 "copy": true, 00:19:13.021 "nvme_iov_md": false 00:19:13.021 }, 00:19:13.021 "memory_domains": [ 00:19:13.021 { 00:19:13.021 "dma_device_id": "system", 00:19:13.021 "dma_device_type": 1 00:19:13.021 }, 00:19:13.021 { 00:19:13.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.021 "dma_device_type": 2 00:19:13.021 } 00:19:13.021 ], 00:19:13.021 "driver_specific": {} 00:19:13.021 }' 00:19:13.021 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.021 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.021 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.021 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:13.280 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.539 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.539 "name": "BaseBdev3", 00:19:13.539 "aliases": [ 00:19:13.539 "24241ae4-a439-4f44-ba2f-a7fe786c216a" 00:19:13.539 ], 00:19:13.539 "product_name": "Malloc disk", 00:19:13.539 "block_size": 512, 00:19:13.539 "num_blocks": 65536, 00:19:13.539 "uuid": "24241ae4-a439-4f44-ba2f-a7fe786c216a", 00:19:13.539 "assigned_rate_limits": { 00:19:13.539 "rw_ios_per_sec": 0, 00:19:13.539 "rw_mbytes_per_sec": 0, 00:19:13.539 "r_mbytes_per_sec": 0, 00:19:13.539 "w_mbytes_per_sec": 0 00:19:13.539 }, 00:19:13.539 "claimed": true, 00:19:13.539 "claim_type": "exclusive_write", 00:19:13.539 "zoned": false, 00:19:13.539 "supported_io_types": { 00:19:13.539 "read": true, 00:19:13.539 "write": true, 00:19:13.539 "unmap": true, 00:19:13.539 "flush": true, 00:19:13.539 "reset": true, 00:19:13.539 "nvme_admin": false, 00:19:13.539 "nvme_io": false, 00:19:13.539 "nvme_io_md": false, 00:19:13.539 "write_zeroes": true, 00:19:13.539 "zcopy": true, 00:19:13.539 "get_zone_info": false, 00:19:13.539 "zone_management": false, 00:19:13.539 "zone_append": false, 00:19:13.539 "compare": false, 00:19:13.539 "compare_and_write": false, 00:19:13.539 "abort": true, 00:19:13.539 "seek_hole": false, 00:19:13.539 "seek_data": false, 00:19:13.539 "copy": true, 00:19:13.539 "nvme_iov_md": false 00:19:13.539 }, 00:19:13.539 "memory_domains": [ 00:19:13.539 { 00:19:13.539 "dma_device_id": "system", 00:19:13.539 "dma_device_type": 1 00:19:13.539 
}, 00:19:13.539 { 00:19:13.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.539 "dma_device_type": 2 00:19:13.539 } 00:19:13.539 ], 00:19:13.539 "driver_specific": {} 00:19:13.539 }' 00:19:13.539 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.798 17:13:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.798 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.798 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.798 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.798 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.798 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.798 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.056 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:14.056 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.056 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.056 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:14.056 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:14.315 [2024-07-23 17:13:09.568118] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:14.315 [2024-07-23 17:13:09.568144] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:14.315 [2024-07-23 17:13:09.568185] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:14.315 
17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.315 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:19:14.573 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.573 "name": "Existed_Raid", 00:19:14.573 "uuid": "53fb8337-22ff-4b95-a316-a9d16be14a2a", 00:19:14.573 "strip_size_kb": 64, 00:19:14.573 "state": "offline", 00:19:14.573 "raid_level": "concat", 00:19:14.573 "superblock": false, 00:19:14.573 "num_base_bdevs": 3, 00:19:14.573 "num_base_bdevs_discovered": 2, 00:19:14.573 "num_base_bdevs_operational": 2, 00:19:14.573 "base_bdevs_list": [ 00:19:14.573 { 00:19:14.573 "name": null, 00:19:14.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.573 "is_configured": false, 00:19:14.573 "data_offset": 0, 00:19:14.573 "data_size": 65536 00:19:14.573 }, 00:19:14.573 { 00:19:14.573 "name": "BaseBdev2", 00:19:14.573 "uuid": "42601f3d-39d9-41de-a83d-5a9a66d4756d", 00:19:14.573 "is_configured": true, 00:19:14.573 "data_offset": 0, 00:19:14.573 "data_size": 65536 00:19:14.573 }, 00:19:14.573 { 00:19:14.573 "name": "BaseBdev3", 00:19:14.573 "uuid": "24241ae4-a439-4f44-ba2f-a7fe786c216a", 00:19:14.573 "is_configured": true, 00:19:14.573 "data_offset": 0, 00:19:14.573 "data_size": 65536 00:19:14.573 } 00:19:14.573 ] 00:19:14.573 }' 00:19:14.573 17:13:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.573 17:13:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.139 17:13:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:15.139 17:13:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:15.139 17:13:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.139 17:13:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:15.396 17:13:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:15.396 17:13:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:15.396 17:13:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:15.653 [2024-07-23 17:13:10.985915] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:15.653 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:15.653 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:15.653 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.653 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:15.912 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:15.912 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:15.912 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:16.170 [2024-07-23 17:13:11.495720] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:16.170 [2024-07-23 17:13:11.495763] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa8800 name Existed_Raid, state offline 00:19:16.170 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:16.170 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:16.170 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.170 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:16.429 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:16.429 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:16.429 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:19:16.429 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:16.429 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:16.429 17:13:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:16.687 BaseBdev2 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:16.687 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:16.945 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:17.512 [ 00:19:17.512 { 00:19:17.512 "name": "BaseBdev2", 00:19:17.512 "aliases": [ 00:19:17.512 "935fabe0-f13f-49a2-9d51-b175b62d2d36" 00:19:17.512 ], 00:19:17.512 "product_name": "Malloc disk", 00:19:17.512 "block_size": 512, 00:19:17.512 "num_blocks": 65536, 00:19:17.512 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:17.512 "assigned_rate_limits": { 00:19:17.512 "rw_ios_per_sec": 0, 00:19:17.512 "rw_mbytes_per_sec": 0, 00:19:17.512 "r_mbytes_per_sec": 0, 00:19:17.512 "w_mbytes_per_sec": 0 00:19:17.512 }, 00:19:17.512 "claimed": false, 00:19:17.512 "zoned": false, 00:19:17.512 "supported_io_types": { 00:19:17.512 "read": true, 00:19:17.512 "write": true, 00:19:17.512 "unmap": true, 00:19:17.512 "flush": true, 00:19:17.512 "reset": true, 00:19:17.512 "nvme_admin": false, 00:19:17.512 "nvme_io": false, 00:19:17.512 "nvme_io_md": false, 00:19:17.512 "write_zeroes": true, 00:19:17.512 "zcopy": true, 00:19:17.512 "get_zone_info": false, 00:19:17.512 "zone_management": false, 00:19:17.512 "zone_append": false, 00:19:17.512 "compare": false, 00:19:17.512 "compare_and_write": false, 00:19:17.512 "abort": true, 00:19:17.512 "seek_hole": false, 00:19:17.512 "seek_data": false, 00:19:17.512 "copy": true, 00:19:17.512 "nvme_iov_md": false 00:19:17.512 }, 00:19:17.512 "memory_domains": [ 00:19:17.512 { 00:19:17.512 "dma_device_id": "system", 00:19:17.512 "dma_device_type": 1 00:19:17.512 }, 00:19:17.512 { 00:19:17.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.512 "dma_device_type": 2 00:19:17.512 } 00:19:17.512 ], 00:19:17.512 "driver_specific": {} 00:19:17.512 } 00:19:17.512 ] 00:19:17.512 17:13:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:17.512 17:13:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:17.512 17:13:12 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:17.512 17:13:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:17.771 BaseBdev3 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:17.771 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.058 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:18.316 [ 00:19:18.316 { 00:19:18.316 "name": "BaseBdev3", 00:19:18.316 "aliases": [ 00:19:18.316 "0fab3ade-735b-4f05-8608-85924b6aa6a4" 00:19:18.316 ], 00:19:18.316 "product_name": "Malloc disk", 00:19:18.316 "block_size": 512, 00:19:18.316 "num_blocks": 65536, 00:19:18.316 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:18.316 "assigned_rate_limits": { 00:19:18.316 "rw_ios_per_sec": 0, 00:19:18.316 "rw_mbytes_per_sec": 0, 00:19:18.316 "r_mbytes_per_sec": 0, 00:19:18.316 "w_mbytes_per_sec": 0 00:19:18.316 }, 00:19:18.316 "claimed": false, 00:19:18.316 "zoned": false, 00:19:18.316 
"supported_io_types": { 00:19:18.316 "read": true, 00:19:18.316 "write": true, 00:19:18.316 "unmap": true, 00:19:18.316 "flush": true, 00:19:18.316 "reset": true, 00:19:18.316 "nvme_admin": false, 00:19:18.316 "nvme_io": false, 00:19:18.316 "nvme_io_md": false, 00:19:18.316 "write_zeroes": true, 00:19:18.316 "zcopy": true, 00:19:18.316 "get_zone_info": false, 00:19:18.316 "zone_management": false, 00:19:18.316 "zone_append": false, 00:19:18.316 "compare": false, 00:19:18.316 "compare_and_write": false, 00:19:18.316 "abort": true, 00:19:18.316 "seek_hole": false, 00:19:18.316 "seek_data": false, 00:19:18.316 "copy": true, 00:19:18.316 "nvme_iov_md": false 00:19:18.316 }, 00:19:18.316 "memory_domains": [ 00:19:18.316 { 00:19:18.316 "dma_device_id": "system", 00:19:18.316 "dma_device_type": 1 00:19:18.316 }, 00:19:18.316 { 00:19:18.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.316 "dma_device_type": 2 00:19:18.316 } 00:19:18.316 ], 00:19:18.316 "driver_specific": {} 00:19:18.316 } 00:19:18.316 ] 00:19:18.316 17:13:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:18.316 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:18.316 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:18.316 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:18.575 [2024-07-23 17:13:13.835764] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:18.575 [2024-07-23 17:13:13.835809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:18.575 [2024-07-23 17:13:13.835829] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:18.575 
[2024-07-23 17:13:13.837166] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.575 17:13:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.142 17:13:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.142 "name": "Existed_Raid", 00:19:19.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.142 "strip_size_kb": 64, 00:19:19.142 "state": "configuring", 00:19:19.142 "raid_level": "concat", 00:19:19.142 "superblock": false, 00:19:19.142 "num_base_bdevs": 3, 00:19:19.142 
"num_base_bdevs_discovered": 2, 00:19:19.142 "num_base_bdevs_operational": 3, 00:19:19.142 "base_bdevs_list": [ 00:19:19.142 { 00:19:19.142 "name": "BaseBdev1", 00:19:19.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.142 "is_configured": false, 00:19:19.142 "data_offset": 0, 00:19:19.142 "data_size": 0 00:19:19.142 }, 00:19:19.142 { 00:19:19.142 "name": "BaseBdev2", 00:19:19.142 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:19.142 "is_configured": true, 00:19:19.142 "data_offset": 0, 00:19:19.142 "data_size": 65536 00:19:19.142 }, 00:19:19.142 { 00:19:19.142 "name": "BaseBdev3", 00:19:19.142 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:19.142 "is_configured": true, 00:19:19.142 "data_offset": 0, 00:19:19.142 "data_size": 65536 00:19:19.142 } 00:19:19.142 ] 00:19:19.142 }' 00:19:19.142 17:13:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.142 17:13:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:20.077 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:20.335 [2024-07-23 17:13:15.500163] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.335 17:13:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.335 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.594 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.594 "name": "Existed_Raid", 00:19:20.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.594 "strip_size_kb": 64, 00:19:20.594 "state": "configuring", 00:19:20.594 "raid_level": "concat", 00:19:20.594 "superblock": false, 00:19:20.594 "num_base_bdevs": 3, 00:19:20.594 "num_base_bdevs_discovered": 1, 00:19:20.594 "num_base_bdevs_operational": 3, 00:19:20.594 "base_bdevs_list": [ 00:19:20.594 { 00:19:20.594 "name": "BaseBdev1", 00:19:20.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.594 "is_configured": false, 00:19:20.594 "data_offset": 0, 00:19:20.594 "data_size": 0 00:19:20.594 }, 00:19:20.594 { 00:19:20.594 "name": null, 00:19:20.594 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:20.594 "is_configured": false, 00:19:20.594 "data_offset": 0, 00:19:20.594 "data_size": 65536 00:19:20.594 }, 00:19:20.594 { 00:19:20.594 "name": "BaseBdev3", 00:19:20.594 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:20.594 "is_configured": true, 00:19:20.594 "data_offset": 0, 
00:19:20.594 "data_size": 65536 00:19:20.594 } 00:19:20.594 ] 00:19:20.594 }' 00:19:20.594 17:13:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.594 17:13:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.160 17:13:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.160 17:13:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:21.419 17:13:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:21.419 17:13:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:21.678 [2024-07-23 17:13:17.032694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:21.678 BaseBdev1 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:21.678 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:21.937 17:13:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:22.196 [ 00:19:22.196 { 00:19:22.196 "name": "BaseBdev1", 00:19:22.196 "aliases": [ 00:19:22.196 "65292275-4fd4-4cfc-a8d2-0e89c2a1e257" 00:19:22.196 ], 00:19:22.196 "product_name": "Malloc disk", 00:19:22.196 "block_size": 512, 00:19:22.196 "num_blocks": 65536, 00:19:22.196 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:22.196 "assigned_rate_limits": { 00:19:22.196 "rw_ios_per_sec": 0, 00:19:22.196 "rw_mbytes_per_sec": 0, 00:19:22.196 "r_mbytes_per_sec": 0, 00:19:22.196 "w_mbytes_per_sec": 0 00:19:22.196 }, 00:19:22.196 "claimed": true, 00:19:22.196 "claim_type": "exclusive_write", 00:19:22.196 "zoned": false, 00:19:22.196 "supported_io_types": { 00:19:22.196 "read": true, 00:19:22.196 "write": true, 00:19:22.196 "unmap": true, 00:19:22.196 "flush": true, 00:19:22.196 "reset": true, 00:19:22.196 "nvme_admin": false, 00:19:22.196 "nvme_io": false, 00:19:22.196 "nvme_io_md": false, 00:19:22.196 "write_zeroes": true, 00:19:22.196 "zcopy": true, 00:19:22.196 "get_zone_info": false, 00:19:22.196 "zone_management": false, 00:19:22.196 "zone_append": false, 00:19:22.196 "compare": false, 00:19:22.196 "compare_and_write": false, 00:19:22.196 "abort": true, 00:19:22.196 "seek_hole": false, 00:19:22.196 "seek_data": false, 00:19:22.196 "copy": true, 00:19:22.196 "nvme_iov_md": false 00:19:22.196 }, 00:19:22.196 "memory_domains": [ 00:19:22.196 { 00:19:22.196 "dma_device_id": "system", 00:19:22.196 "dma_device_type": 1 00:19:22.196 }, 00:19:22.196 { 00:19:22.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.196 "dma_device_type": 2 00:19:22.196 } 00:19:22.196 ], 00:19:22.196 "driver_specific": {} 00:19:22.196 } 00:19:22.196 ] 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:22.196 17:13:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.196 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.454 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.454 "name": "Existed_Raid", 00:19:22.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.454 "strip_size_kb": 64, 00:19:22.454 "state": "configuring", 00:19:22.454 "raid_level": "concat", 00:19:22.454 "superblock": false, 00:19:22.454 "num_base_bdevs": 3, 00:19:22.454 "num_base_bdevs_discovered": 2, 00:19:22.454 "num_base_bdevs_operational": 3, 00:19:22.454 "base_bdevs_list": [ 00:19:22.454 { 
00:19:22.454 "name": "BaseBdev1", 00:19:22.454 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:22.454 "is_configured": true, 00:19:22.454 "data_offset": 0, 00:19:22.454 "data_size": 65536 00:19:22.454 }, 00:19:22.454 { 00:19:22.454 "name": null, 00:19:22.454 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:22.454 "is_configured": false, 00:19:22.454 "data_offset": 0, 00:19:22.454 "data_size": 65536 00:19:22.455 }, 00:19:22.455 { 00:19:22.455 "name": "BaseBdev3", 00:19:22.455 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:22.455 "is_configured": true, 00:19:22.455 "data_offset": 0, 00:19:22.455 "data_size": 65536 00:19:22.455 } 00:19:22.455 ] 00:19:22.455 }' 00:19:22.455 17:13:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.455 17:13:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:23.021 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.021 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:23.279 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:23.279 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:23.538 [2024-07-23 17:13:18.881597] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.538 17:13:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.796 17:13:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.796 "name": "Existed_Raid", 00:19:23.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:23.796 "strip_size_kb": 64, 00:19:23.796 "state": "configuring", 00:19:23.796 "raid_level": "concat", 00:19:23.796 "superblock": false, 00:19:23.796 "num_base_bdevs": 3, 00:19:23.796 "num_base_bdevs_discovered": 1, 00:19:23.796 "num_base_bdevs_operational": 3, 00:19:23.796 "base_bdevs_list": [ 00:19:23.796 { 00:19:23.796 "name": "BaseBdev1", 00:19:23.796 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:23.796 "is_configured": true, 00:19:23.796 "data_offset": 0, 00:19:23.796 "data_size": 65536 00:19:23.796 }, 00:19:23.796 { 00:19:23.796 "name": null, 00:19:23.796 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:23.796 
"is_configured": false, 00:19:23.796 "data_offset": 0, 00:19:23.796 "data_size": 65536 00:19:23.796 }, 00:19:23.796 { 00:19:23.796 "name": null, 00:19:23.796 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:23.796 "is_configured": false, 00:19:23.796 "data_offset": 0, 00:19:23.796 "data_size": 65536 00:19:23.796 } 00:19:23.796 ] 00:19:23.796 }' 00:19:23.796 17:13:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.796 17:13:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:24.730 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.730 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:24.988 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:24.988 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:25.247 [2024-07-23 17:13:20.513936] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.247 17:13:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.247 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.504 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.504 "name": "Existed_Raid", 00:19:25.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.504 "strip_size_kb": 64, 00:19:25.504 "state": "configuring", 00:19:25.504 "raid_level": "concat", 00:19:25.504 "superblock": false, 00:19:25.504 "num_base_bdevs": 3, 00:19:25.504 "num_base_bdevs_discovered": 2, 00:19:25.504 "num_base_bdevs_operational": 3, 00:19:25.504 "base_bdevs_list": [ 00:19:25.504 { 00:19:25.504 "name": "BaseBdev1", 00:19:25.504 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:25.504 "is_configured": true, 00:19:25.504 "data_offset": 0, 00:19:25.504 "data_size": 65536 00:19:25.504 }, 00:19:25.504 { 00:19:25.504 "name": null, 00:19:25.504 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:25.504 "is_configured": false, 00:19:25.504 "data_offset": 0, 00:19:25.504 "data_size": 65536 00:19:25.504 }, 00:19:25.504 { 00:19:25.504 "name": "BaseBdev3", 00:19:25.504 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:25.504 "is_configured": true, 00:19:25.504 "data_offset": 0, 
00:19:25.504 "data_size": 65536 00:19:25.504 } 00:19:25.504 ] 00:19:25.504 }' 00:19:25.504 17:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.504 17:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.439 17:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.439 17:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:26.697 17:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:26.697 17:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:27.263 [2024-07-23 17:13:22.423032] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.263 
17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.263 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:27.521 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.521 "name": "Existed_Raid", 00:19:27.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.521 "strip_size_kb": 64, 00:19:27.521 "state": "configuring", 00:19:27.521 "raid_level": "concat", 00:19:27.521 "superblock": false, 00:19:27.521 "num_base_bdevs": 3, 00:19:27.521 "num_base_bdevs_discovered": 1, 00:19:27.521 "num_base_bdevs_operational": 3, 00:19:27.521 "base_bdevs_list": [ 00:19:27.521 { 00:19:27.521 "name": null, 00:19:27.521 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:27.521 "is_configured": false, 00:19:27.521 "data_offset": 0, 00:19:27.521 "data_size": 65536 00:19:27.521 }, 00:19:27.521 { 00:19:27.521 "name": null, 00:19:27.521 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:27.521 "is_configured": false, 00:19:27.521 "data_offset": 0, 00:19:27.521 "data_size": 65536 00:19:27.521 }, 00:19:27.521 { 00:19:27.521 "name": "BaseBdev3", 00:19:27.521 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:27.521 "is_configured": true, 00:19:27.521 "data_offset": 0, 00:19:27.521 "data_size": 65536 00:19:27.521 } 00:19:27.521 ] 00:19:27.521 }' 00:19:27.521 17:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.521 17:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.455 17:13:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.455 17:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:28.455 17:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:28.455 17:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:29.021 [2024-07-23 17:13:24.344526] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.021 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.587 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.587 "name": "Existed_Raid", 00:19:29.587 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.587 "strip_size_kb": 64, 00:19:29.587 "state": "configuring", 00:19:29.587 "raid_level": "concat", 00:19:29.587 "superblock": false, 00:19:29.587 "num_base_bdevs": 3, 00:19:29.587 "num_base_bdevs_discovered": 2, 00:19:29.587 "num_base_bdevs_operational": 3, 00:19:29.587 "base_bdevs_list": [ 00:19:29.587 { 00:19:29.587 "name": null, 00:19:29.587 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:29.587 "is_configured": false, 00:19:29.587 "data_offset": 0, 00:19:29.587 "data_size": 65536 00:19:29.587 }, 00:19:29.587 { 00:19:29.587 "name": "BaseBdev2", 00:19:29.587 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:29.587 "is_configured": true, 00:19:29.587 "data_offset": 0, 00:19:29.587 "data_size": 65536 00:19:29.587 }, 00:19:29.587 { 00:19:29.587 "name": "BaseBdev3", 00:19:29.587 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:29.587 "is_configured": true, 00:19:29.587 "data_offset": 0, 00:19:29.587 "data_size": 65536 00:19:29.587 } 00:19:29.587 ] 00:19:29.587 }' 00:19:29.587 17:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.587 17:13:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.522 17:13:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.522 17:13:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:30.780 
17:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:30.780 17:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.780 17:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:31.039 17:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 65292275-4fd4-4cfc-a8d2-0e89c2a1e257 00:19:31.606 [2024-07-23 17:13:26.751383] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:31.606 [2024-07-23 17:13:26.751426] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fab900 00:19:31.606 [2024-07-23 17:13:26.751435] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:19:31.606 [2024-07-23 17:13:26.751625] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215d9c0 00:19:31.606 [2024-07-23 17:13:26.751739] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fab900 00:19:31.606 [2024-07-23 17:13:26.751748] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fab900 00:19:31.606 [2024-07-23 17:13:26.751918] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:31.606 NewBaseBdev 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:31.606 17:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:32.174 17:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:32.433 [ 00:19:32.433 { 00:19:32.433 "name": "NewBaseBdev", 00:19:32.433 "aliases": [ 00:19:32.433 "65292275-4fd4-4cfc-a8d2-0e89c2a1e257" 00:19:32.433 ], 00:19:32.433 "product_name": "Malloc disk", 00:19:32.433 "block_size": 512, 00:19:32.433 "num_blocks": 65536, 00:19:32.433 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:32.433 "assigned_rate_limits": { 00:19:32.433 "rw_ios_per_sec": 0, 00:19:32.433 "rw_mbytes_per_sec": 0, 00:19:32.433 "r_mbytes_per_sec": 0, 00:19:32.433 "w_mbytes_per_sec": 0 00:19:32.433 }, 00:19:32.433 "claimed": true, 00:19:32.433 "claim_type": "exclusive_write", 00:19:32.433 "zoned": false, 00:19:32.433 "supported_io_types": { 00:19:32.433 "read": true, 00:19:32.433 "write": true, 00:19:32.433 "unmap": true, 00:19:32.433 "flush": true, 00:19:32.433 "reset": true, 00:19:32.433 "nvme_admin": false, 00:19:32.433 "nvme_io": false, 00:19:32.433 "nvme_io_md": false, 00:19:32.433 "write_zeroes": true, 00:19:32.433 "zcopy": true, 00:19:32.433 "get_zone_info": false, 00:19:32.433 "zone_management": false, 00:19:32.433 "zone_append": false, 00:19:32.433 "compare": false, 00:19:32.433 "compare_and_write": false, 00:19:32.433 "abort": true, 00:19:32.433 "seek_hole": false, 00:19:32.433 "seek_data": false, 00:19:32.433 "copy": true, 00:19:32.433 "nvme_iov_md": 
false 00:19:32.433 }, 00:19:32.433 "memory_domains": [ 00:19:32.433 { 00:19:32.433 "dma_device_id": "system", 00:19:32.433 "dma_device_type": 1 00:19:32.433 }, 00:19:32.433 { 00:19:32.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.433 "dma_device_type": 2 00:19:32.433 } 00:19:32.433 ], 00:19:32.433 "driver_specific": {} 00:19:32.433 } 00:19:32.433 ] 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.433 17:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.001 17:13:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.001 "name": "Existed_Raid", 00:19:33.001 "uuid": "551b815d-eebb-4ac9-af9a-d90c1b8ec828", 00:19:33.001 "strip_size_kb": 64, 00:19:33.001 "state": "online", 00:19:33.001 "raid_level": "concat", 00:19:33.001 "superblock": false, 00:19:33.001 "num_base_bdevs": 3, 00:19:33.001 "num_base_bdevs_discovered": 3, 00:19:33.001 "num_base_bdevs_operational": 3, 00:19:33.001 "base_bdevs_list": [ 00:19:33.001 { 00:19:33.001 "name": "NewBaseBdev", 00:19:33.001 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:33.001 "is_configured": true, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 }, 00:19:33.001 { 00:19:33.001 "name": "BaseBdev2", 00:19:33.001 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:33.001 "is_configured": true, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 }, 00:19:33.001 { 00:19:33.001 "name": "BaseBdev3", 00:19:33.001 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:33.001 "is_configured": true, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 } 00:19:33.001 ] 00:19:33.001 }' 00:19:33.001 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.001 17:13:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.633 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:33.633 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:33.633 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:33.633 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:33.634 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:33.634 17:13:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:33.634 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:33.634 17:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:33.892 [2024-07-23 17:13:29.138062] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:33.892 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:33.892 "name": "Existed_Raid", 00:19:33.892 "aliases": [ 00:19:33.892 "551b815d-eebb-4ac9-af9a-d90c1b8ec828" 00:19:33.892 ], 00:19:33.892 "product_name": "Raid Volume", 00:19:33.892 "block_size": 512, 00:19:33.892 "num_blocks": 196608, 00:19:33.892 "uuid": "551b815d-eebb-4ac9-af9a-d90c1b8ec828", 00:19:33.892 "assigned_rate_limits": { 00:19:33.892 "rw_ios_per_sec": 0, 00:19:33.892 "rw_mbytes_per_sec": 0, 00:19:33.892 "r_mbytes_per_sec": 0, 00:19:33.892 "w_mbytes_per_sec": 0 00:19:33.892 }, 00:19:33.892 "claimed": false, 00:19:33.892 "zoned": false, 00:19:33.892 "supported_io_types": { 00:19:33.892 "read": true, 00:19:33.892 "write": true, 00:19:33.892 "unmap": true, 00:19:33.892 "flush": true, 00:19:33.892 "reset": true, 00:19:33.892 "nvme_admin": false, 00:19:33.892 "nvme_io": false, 00:19:33.892 "nvme_io_md": false, 00:19:33.892 "write_zeroes": true, 00:19:33.892 "zcopy": false, 00:19:33.892 "get_zone_info": false, 00:19:33.892 "zone_management": false, 00:19:33.892 "zone_append": false, 00:19:33.892 "compare": false, 00:19:33.892 "compare_and_write": false, 00:19:33.892 "abort": false, 00:19:33.892 "seek_hole": false, 00:19:33.892 "seek_data": false, 00:19:33.892 "copy": false, 00:19:33.892 "nvme_iov_md": false 00:19:33.892 }, 00:19:33.892 "memory_domains": [ 00:19:33.892 { 00:19:33.892 "dma_device_id": "system", 00:19:33.892 "dma_device_type": 1 00:19:33.892 }, 
00:19:33.892 { 00:19:33.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.893 "dma_device_type": 2 00:19:33.893 }, 00:19:33.893 { 00:19:33.893 "dma_device_id": "system", 00:19:33.893 "dma_device_type": 1 00:19:33.893 }, 00:19:33.893 { 00:19:33.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.893 "dma_device_type": 2 00:19:33.893 }, 00:19:33.893 { 00:19:33.893 "dma_device_id": "system", 00:19:33.893 "dma_device_type": 1 00:19:33.893 }, 00:19:33.893 { 00:19:33.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.893 "dma_device_type": 2 00:19:33.893 } 00:19:33.893 ], 00:19:33.893 "driver_specific": { 00:19:33.893 "raid": { 00:19:33.893 "uuid": "551b815d-eebb-4ac9-af9a-d90c1b8ec828", 00:19:33.893 "strip_size_kb": 64, 00:19:33.893 "state": "online", 00:19:33.893 "raid_level": "concat", 00:19:33.893 "superblock": false, 00:19:33.893 "num_base_bdevs": 3, 00:19:33.893 "num_base_bdevs_discovered": 3, 00:19:33.893 "num_base_bdevs_operational": 3, 00:19:33.893 "base_bdevs_list": [ 00:19:33.893 { 00:19:33.893 "name": "NewBaseBdev", 00:19:33.893 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:33.893 "is_configured": true, 00:19:33.893 "data_offset": 0, 00:19:33.893 "data_size": 65536 00:19:33.893 }, 00:19:33.893 { 00:19:33.893 "name": "BaseBdev2", 00:19:33.893 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:33.893 "is_configured": true, 00:19:33.893 "data_offset": 0, 00:19:33.893 "data_size": 65536 00:19:33.893 }, 00:19:33.893 { 00:19:33.893 "name": "BaseBdev3", 00:19:33.893 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:33.893 "is_configured": true, 00:19:33.893 "data_offset": 0, 00:19:33.893 "data_size": 65536 00:19:33.893 } 00:19:33.893 ] 00:19:33.893 } 00:19:33.893 } 00:19:33.893 }' 00:19:33.893 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:33.893 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:19:33.893 BaseBdev2 00:19:33.893 BaseBdev3' 00:19:33.893 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:33.893 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:33.893 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:34.461 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:34.461 "name": "NewBaseBdev", 00:19:34.461 "aliases": [ 00:19:34.461 "65292275-4fd4-4cfc-a8d2-0e89c2a1e257" 00:19:34.461 ], 00:19:34.461 "product_name": "Malloc disk", 00:19:34.461 "block_size": 512, 00:19:34.461 "num_blocks": 65536, 00:19:34.461 "uuid": "65292275-4fd4-4cfc-a8d2-0e89c2a1e257", 00:19:34.461 "assigned_rate_limits": { 00:19:34.461 "rw_ios_per_sec": 0, 00:19:34.461 "rw_mbytes_per_sec": 0, 00:19:34.461 "r_mbytes_per_sec": 0, 00:19:34.461 "w_mbytes_per_sec": 0 00:19:34.461 }, 00:19:34.461 "claimed": true, 00:19:34.461 "claim_type": "exclusive_write", 00:19:34.461 "zoned": false, 00:19:34.461 "supported_io_types": { 00:19:34.461 "read": true, 00:19:34.461 "write": true, 00:19:34.461 "unmap": true, 00:19:34.461 "flush": true, 00:19:34.461 "reset": true, 00:19:34.461 "nvme_admin": false, 00:19:34.461 "nvme_io": false, 00:19:34.461 "nvme_io_md": false, 00:19:34.461 "write_zeroes": true, 00:19:34.461 "zcopy": true, 00:19:34.461 "get_zone_info": false, 00:19:34.461 "zone_management": false, 00:19:34.461 "zone_append": false, 00:19:34.461 "compare": false, 00:19:34.461 "compare_and_write": false, 00:19:34.461 "abort": true, 00:19:34.461 "seek_hole": false, 00:19:34.461 "seek_data": false, 00:19:34.461 "copy": true, 00:19:34.461 "nvme_iov_md": false 00:19:34.461 }, 00:19:34.461 "memory_domains": [ 00:19:34.461 { 00:19:34.461 "dma_device_id": "system", 00:19:34.461 
"dma_device_type": 1 00:19:34.461 }, 00:19:34.461 { 00:19:34.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.461 "dma_device_type": 2 00:19:34.461 } 00:19:34.461 ], 00:19:34.461 "driver_specific": {} 00:19:34.461 }' 00:19:34.461 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.461 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.461 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:34.461 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:34.461 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:34.720 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:34.720 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:34.720 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:34.720 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:34.720 17:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:34.720 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:34.720 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:34.720 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:34.720 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:34.720 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:34.979 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:34.979 "name": 
"BaseBdev2", 00:19:34.979 "aliases": [ 00:19:34.979 "935fabe0-f13f-49a2-9d51-b175b62d2d36" 00:19:34.979 ], 00:19:34.979 "product_name": "Malloc disk", 00:19:34.979 "block_size": 512, 00:19:34.979 "num_blocks": 65536, 00:19:34.979 "uuid": "935fabe0-f13f-49a2-9d51-b175b62d2d36", 00:19:34.979 "assigned_rate_limits": { 00:19:34.979 "rw_ios_per_sec": 0, 00:19:34.979 "rw_mbytes_per_sec": 0, 00:19:34.979 "r_mbytes_per_sec": 0, 00:19:34.979 "w_mbytes_per_sec": 0 00:19:34.979 }, 00:19:34.979 "claimed": true, 00:19:34.979 "claim_type": "exclusive_write", 00:19:34.979 "zoned": false, 00:19:34.979 "supported_io_types": { 00:19:34.979 "read": true, 00:19:34.979 "write": true, 00:19:34.979 "unmap": true, 00:19:34.979 "flush": true, 00:19:34.979 "reset": true, 00:19:34.979 "nvme_admin": false, 00:19:34.979 "nvme_io": false, 00:19:34.979 "nvme_io_md": false, 00:19:34.979 "write_zeroes": true, 00:19:34.979 "zcopy": true, 00:19:34.979 "get_zone_info": false, 00:19:34.979 "zone_management": false, 00:19:34.979 "zone_append": false, 00:19:34.979 "compare": false, 00:19:34.979 "compare_and_write": false, 00:19:34.979 "abort": true, 00:19:34.979 "seek_hole": false, 00:19:34.979 "seek_data": false, 00:19:34.979 "copy": true, 00:19:34.979 "nvme_iov_md": false 00:19:34.979 }, 00:19:34.979 "memory_domains": [ 00:19:34.979 { 00:19:34.979 "dma_device_id": "system", 00:19:34.979 "dma_device_type": 1 00:19:34.979 }, 00:19:34.979 { 00:19:34.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.979 "dma_device_type": 2 00:19:34.979 } 00:19:34.979 ], 00:19:34.979 "driver_specific": {} 00:19:34.979 }' 00:19:34.979 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.979 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.979 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:34.979 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:35.238 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:35.497 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:35.497 "name": "BaseBdev3", 00:19:35.497 "aliases": [ 00:19:35.497 "0fab3ade-735b-4f05-8608-85924b6aa6a4" 00:19:35.497 ], 00:19:35.497 "product_name": "Malloc disk", 00:19:35.497 "block_size": 512, 00:19:35.497 "num_blocks": 65536, 00:19:35.497 "uuid": "0fab3ade-735b-4f05-8608-85924b6aa6a4", 00:19:35.497 "assigned_rate_limits": { 00:19:35.497 "rw_ios_per_sec": 0, 00:19:35.497 "rw_mbytes_per_sec": 0, 00:19:35.497 "r_mbytes_per_sec": 0, 00:19:35.497 "w_mbytes_per_sec": 0 00:19:35.497 }, 00:19:35.497 "claimed": true, 00:19:35.497 "claim_type": "exclusive_write", 00:19:35.497 "zoned": false, 00:19:35.497 "supported_io_types": { 
00:19:35.497 "read": true, 00:19:35.497 "write": true, 00:19:35.497 "unmap": true, 00:19:35.497 "flush": true, 00:19:35.497 "reset": true, 00:19:35.497 "nvme_admin": false, 00:19:35.497 "nvme_io": false, 00:19:35.497 "nvme_io_md": false, 00:19:35.497 "write_zeroes": true, 00:19:35.497 "zcopy": true, 00:19:35.497 "get_zone_info": false, 00:19:35.497 "zone_management": false, 00:19:35.497 "zone_append": false, 00:19:35.497 "compare": false, 00:19:35.497 "compare_and_write": false, 00:19:35.497 "abort": true, 00:19:35.497 "seek_hole": false, 00:19:35.497 "seek_data": false, 00:19:35.497 "copy": true, 00:19:35.497 "nvme_iov_md": false 00:19:35.497 }, 00:19:35.497 "memory_domains": [ 00:19:35.497 { 00:19:35.497 "dma_device_id": "system", 00:19:35.497 "dma_device_type": 1 00:19:35.497 }, 00:19:35.497 { 00:19:35.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.497 "dma_device_type": 2 00:19:35.497 } 00:19:35.497 ], 00:19:35.497 "driver_specific": {} 00:19:35.497 }' 00:19:35.497 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.755 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.755 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:35.755 17:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.755 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.755 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:35.755 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.755 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.755 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.755 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:36.014 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.014 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:36.014 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:36.273 [2024-07-23 17:13:31.455884] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:36.273 [2024-07-23 17:13:31.455919] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:36.273 [2024-07-23 17:13:31.455971] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:36.273 [2024-07-23 17:13:31.456020] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:36.273 [2024-07-23 17:13:31.456032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fab900 name Existed_Raid, state offline 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4142699 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4142699 ']' 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4142699 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4142699 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4142699' 00:19:36.273 killing process with pid 4142699 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4142699 00:19:36.273 [2024-07-23 17:13:31.531149] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:36.273 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4142699 00:19:36.273 [2024-07-23 17:13:31.559103] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:36.531 17:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:36.531 00:19:36.531 real 0m33.121s 00:19:36.531 user 1m1.271s 00:19:36.531 sys 0m5.870s 00:19:36.531 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:36.531 17:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.531 ************************************ 00:19:36.531 END TEST raid_state_function_test 00:19:36.531 ************************************ 00:19:36.531 17:13:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:36.531 17:13:31 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:19:36.532 17:13:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:36.532 17:13:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:36.532 17:13:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:36.532 ************************************ 00:19:36.532 START TEST raid_state_function_test_sb 00:19:36.532 ************************************ 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4147678 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4147678' 00:19:36.532 Process raid pid: 4147678 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4147678 /var/tmp/spdk-raid.sock 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4147678 ']' 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:19:36.532 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:36.532 17:13:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:36.532 [2024-07-23 17:13:31.928568] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:19:36.532 [2024-07-23 17:13:31.928643] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:36.790 [2024-07-23 17:13:32.063495] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:36.790 [2024-07-23 17:13:32.114041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:36.790 [2024-07-23 17:13:32.172290] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:36.790 [2024-07-23 17:13:32.172317] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:37.723 17:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:37.723 17:13:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:37.723 17:13:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:37.723 [2024-07-23 17:13:33.038175] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:37.723 [2024-07-23 17:13:33.038216] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:37.723 [2024-07-23 17:13:33.038227] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:37.723 [2024-07-23 17:13:33.038238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:37.724 [2024-07-23 17:13:33.038247] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:37.724 [2024-07-23 17:13:33.038258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.724 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:19:37.982 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.982 "name": "Existed_Raid", 00:19:37.982 "uuid": "ae3364c5-7efb-4019-a71d-d692f2a4f0d6", 00:19:37.982 "strip_size_kb": 64, 00:19:37.982 "state": "configuring", 00:19:37.982 "raid_level": "concat", 00:19:37.982 "superblock": true, 00:19:37.982 "num_base_bdevs": 3, 00:19:37.982 "num_base_bdevs_discovered": 0, 00:19:37.982 "num_base_bdevs_operational": 3, 00:19:37.982 "base_bdevs_list": [ 00:19:37.982 { 00:19:37.982 "name": "BaseBdev1", 00:19:37.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.982 "is_configured": false, 00:19:37.982 "data_offset": 0, 00:19:37.982 "data_size": 0 00:19:37.982 }, 00:19:37.982 { 00:19:37.982 "name": "BaseBdev2", 00:19:37.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.982 "is_configured": false, 00:19:37.982 "data_offset": 0, 00:19:37.982 "data_size": 0 00:19:37.982 }, 00:19:37.982 { 00:19:37.982 "name": "BaseBdev3", 00:19:37.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.982 "is_configured": false, 00:19:37.982 "data_offset": 0, 00:19:37.982 "data_size": 0 00:19:37.982 } 00:19:37.982 ] 00:19:37.982 }' 00:19:37.982 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.982 17:13:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:38.547 17:13:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:38.805 [2024-07-23 17:13:34.173023] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:38.805 [2024-07-23 17:13:34.173059] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ff0280 name Existed_Raid, state configuring 00:19:38.805 17:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:39.064 [2024-07-23 17:13:34.417700] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:39.064 [2024-07-23 17:13:34.417734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:39.064 [2024-07-23 17:13:34.417744] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:39.064 [2024-07-23 17:13:34.417755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:39.064 [2024-07-23 17:13:34.417764] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:39.064 [2024-07-23 17:13:34.417775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:39.064 17:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:39.322 [2024-07-23 17:13:34.673435] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:39.322 BaseBdev1 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:19:39.322 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:39.580 17:13:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:39.838 [ 00:19:39.838 { 00:19:39.838 "name": "BaseBdev1", 00:19:39.838 "aliases": [ 00:19:39.838 "af48a377-3c93-468b-9d20-206e662d0977" 00:19:39.838 ], 00:19:39.838 "product_name": "Malloc disk", 00:19:39.838 "block_size": 512, 00:19:39.838 "num_blocks": 65536, 00:19:39.838 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:39.838 "assigned_rate_limits": { 00:19:39.838 "rw_ios_per_sec": 0, 00:19:39.838 "rw_mbytes_per_sec": 0, 00:19:39.838 "r_mbytes_per_sec": 0, 00:19:39.838 "w_mbytes_per_sec": 0 00:19:39.838 }, 00:19:39.838 "claimed": true, 00:19:39.838 "claim_type": "exclusive_write", 00:19:39.838 "zoned": false, 00:19:39.838 "supported_io_types": { 00:19:39.838 "read": true, 00:19:39.838 "write": true, 00:19:39.838 "unmap": true, 00:19:39.838 "flush": true, 00:19:39.838 "reset": true, 00:19:39.838 "nvme_admin": false, 00:19:39.838 "nvme_io": false, 00:19:39.838 "nvme_io_md": false, 00:19:39.838 "write_zeroes": true, 00:19:39.838 "zcopy": true, 00:19:39.838 "get_zone_info": false, 00:19:39.838 "zone_management": false, 00:19:39.838 "zone_append": false, 00:19:39.838 "compare": false, 00:19:39.838 "compare_and_write": false, 00:19:39.838 "abort": true, 00:19:39.838 "seek_hole": false, 00:19:39.838 "seek_data": false, 00:19:39.838 "copy": true, 00:19:39.838 "nvme_iov_md": false 00:19:39.838 }, 00:19:39.838 "memory_domains": [ 00:19:39.838 { 00:19:39.838 "dma_device_id": "system", 00:19:39.838 "dma_device_type": 1 00:19:39.838 }, 00:19:39.838 { 00:19:39.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.838 
"dma_device_type": 2 00:19:39.838 } 00:19:39.838 ], 00:19:39.838 "driver_specific": {} 00:19:39.838 } 00:19:39.838 ] 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.838 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:40.096 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.096 "name": "Existed_Raid", 00:19:40.096 "uuid": "355b5c50-4cff-481b-a0eb-51a39a8d3bbb", 00:19:40.096 "strip_size_kb": 64, 
00:19:40.096 "state": "configuring", 00:19:40.096 "raid_level": "concat", 00:19:40.096 "superblock": true, 00:19:40.096 "num_base_bdevs": 3, 00:19:40.096 "num_base_bdevs_discovered": 1, 00:19:40.096 "num_base_bdevs_operational": 3, 00:19:40.096 "base_bdevs_list": [ 00:19:40.096 { 00:19:40.096 "name": "BaseBdev1", 00:19:40.096 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:40.096 "is_configured": true, 00:19:40.096 "data_offset": 2048, 00:19:40.096 "data_size": 63488 00:19:40.096 }, 00:19:40.096 { 00:19:40.096 "name": "BaseBdev2", 00:19:40.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.096 "is_configured": false, 00:19:40.096 "data_offset": 0, 00:19:40.096 "data_size": 0 00:19:40.096 }, 00:19:40.096 { 00:19:40.096 "name": "BaseBdev3", 00:19:40.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.096 "is_configured": false, 00:19:40.096 "data_offset": 0, 00:19:40.096 "data_size": 0 00:19:40.096 } 00:19:40.096 ] 00:19:40.096 }' 00:19:40.096 17:13:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.096 17:13:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:40.662 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:40.919 [2024-07-23 17:13:36.233571] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:40.919 [2024-07-23 17:13:36.233614] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fefbb0 name Existed_Raid, state configuring 00:19:40.919 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:41.177 [2024-07-23 17:13:36.482280] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:41.177 [2024-07-23 17:13:36.483697] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:41.177 [2024-07-23 17:13:36.483729] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:41.177 [2024-07-23 17:13:36.483739] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:41.177 [2024-07-23 17:13:36.483750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.177 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:41.434 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.434 "name": "Existed_Raid", 00:19:41.434 "uuid": "e97600aa-f2ce-413f-b34c-86ce3aa2b946", 00:19:41.434 "strip_size_kb": 64, 00:19:41.434 "state": "configuring", 00:19:41.434 "raid_level": "concat", 00:19:41.434 "superblock": true, 00:19:41.434 "num_base_bdevs": 3, 00:19:41.434 "num_base_bdevs_discovered": 1, 00:19:41.434 "num_base_bdevs_operational": 3, 00:19:41.434 "base_bdevs_list": [ 00:19:41.434 { 00:19:41.434 "name": "BaseBdev1", 00:19:41.434 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:41.434 "is_configured": true, 00:19:41.434 "data_offset": 2048, 00:19:41.434 "data_size": 63488 00:19:41.434 }, 00:19:41.434 { 00:19:41.434 "name": "BaseBdev2", 00:19:41.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.434 "is_configured": false, 00:19:41.434 "data_offset": 0, 00:19:41.434 "data_size": 0 00:19:41.434 }, 00:19:41.434 { 00:19:41.434 "name": "BaseBdev3", 00:19:41.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.434 "is_configured": false, 00:19:41.434 "data_offset": 0, 00:19:41.434 "data_size": 0 00:19:41.434 } 00:19:41.434 ] 00:19:41.434 }' 00:19:41.434 17:13:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.434 17:13:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:42.000 17:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:19:42.259 [2024-07-23 17:13:37.600654] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:42.259 BaseBdev2 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:42.259 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:42.518 17:13:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:42.776 [ 00:19:42.776 { 00:19:42.776 "name": "BaseBdev2", 00:19:42.776 "aliases": [ 00:19:42.776 "d02708c5-f249-475a-8a0c-6d5a348b4ebd" 00:19:42.776 ], 00:19:42.776 "product_name": "Malloc disk", 00:19:42.776 "block_size": 512, 00:19:42.776 "num_blocks": 65536, 00:19:42.776 "uuid": "d02708c5-f249-475a-8a0c-6d5a348b4ebd", 00:19:42.776 "assigned_rate_limits": { 00:19:42.776 "rw_ios_per_sec": 0, 00:19:42.776 "rw_mbytes_per_sec": 0, 00:19:42.776 "r_mbytes_per_sec": 0, 00:19:42.776 "w_mbytes_per_sec": 0 00:19:42.776 }, 00:19:42.776 "claimed": true, 00:19:42.776 "claim_type": "exclusive_write", 00:19:42.776 "zoned": false, 00:19:42.776 "supported_io_types": { 00:19:42.776 "read": true, 00:19:42.776 "write": true, 
00:19:42.776 "unmap": true, 00:19:42.776 "flush": true, 00:19:42.776 "reset": true, 00:19:42.776 "nvme_admin": false, 00:19:42.776 "nvme_io": false, 00:19:42.776 "nvme_io_md": false, 00:19:42.776 "write_zeroes": true, 00:19:42.776 "zcopy": true, 00:19:42.776 "get_zone_info": false, 00:19:42.776 "zone_management": false, 00:19:42.776 "zone_append": false, 00:19:42.776 "compare": false, 00:19:42.776 "compare_and_write": false, 00:19:42.776 "abort": true, 00:19:42.776 "seek_hole": false, 00:19:42.776 "seek_data": false, 00:19:42.776 "copy": true, 00:19:42.776 "nvme_iov_md": false 00:19:42.776 }, 00:19:42.776 "memory_domains": [ 00:19:42.776 { 00:19:42.776 "dma_device_id": "system", 00:19:42.776 "dma_device_type": 1 00:19:42.776 }, 00:19:42.776 { 00:19:42.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.776 "dma_device_type": 2 00:19:42.776 } 00:19:42.776 ], 00:19:42.776 "driver_specific": {} 00:19:42.776 } 00:19:42.776 ] 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.776 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:43.035 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.035 "name": "Existed_Raid", 00:19:43.035 "uuid": "e97600aa-f2ce-413f-b34c-86ce3aa2b946", 00:19:43.035 "strip_size_kb": 64, 00:19:43.035 "state": "configuring", 00:19:43.035 "raid_level": "concat", 00:19:43.035 "superblock": true, 00:19:43.035 "num_base_bdevs": 3, 00:19:43.035 "num_base_bdevs_discovered": 2, 00:19:43.035 "num_base_bdevs_operational": 3, 00:19:43.035 "base_bdevs_list": [ 00:19:43.035 { 00:19:43.035 "name": "BaseBdev1", 00:19:43.035 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:43.035 "is_configured": true, 00:19:43.035 "data_offset": 2048, 00:19:43.035 "data_size": 63488 00:19:43.035 }, 00:19:43.035 { 00:19:43.035 "name": "BaseBdev2", 00:19:43.035 "uuid": "d02708c5-f249-475a-8a0c-6d5a348b4ebd", 00:19:43.035 "is_configured": true, 00:19:43.035 "data_offset": 2048, 00:19:43.035 "data_size": 63488 00:19:43.035 }, 00:19:43.035 { 00:19:43.035 "name": "BaseBdev3", 00:19:43.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.035 "is_configured": false, 00:19:43.035 "data_offset": 0, 00:19:43.035 "data_size": 0 00:19:43.035 } 
00:19:43.035 ] 00:19:43.035 }' 00:19:43.035 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.035 17:13:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:43.600 17:13:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:43.858 [2024-07-23 17:13:39.212391] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:43.858 [2024-07-23 17:13:39.212560] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fef800 00:19:43.858 [2024-07-23 17:13:39.212574] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:19:43.858 [2024-07-23 17:13:39.212743] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff3b50 00:19:43.858 [2024-07-23 17:13:39.212865] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fef800 00:19:43.858 [2024-07-23 17:13:39.212875] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fef800 00:19:43.858 [2024-07-23 17:13:39.212979] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:43.858 BaseBdev3 00:19:43.858 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:43.858 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:43.859 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:43.859 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:43.859 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:43.859 17:13:39 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:43.859 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:44.117 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:44.375 [ 00:19:44.375 { 00:19:44.375 "name": "BaseBdev3", 00:19:44.375 "aliases": [ 00:19:44.375 "005caf76-7253-4377-970d-a513bbc5c470" 00:19:44.375 ], 00:19:44.375 "product_name": "Malloc disk", 00:19:44.375 "block_size": 512, 00:19:44.375 "num_blocks": 65536, 00:19:44.375 "uuid": "005caf76-7253-4377-970d-a513bbc5c470", 00:19:44.375 "assigned_rate_limits": { 00:19:44.375 "rw_ios_per_sec": 0, 00:19:44.375 "rw_mbytes_per_sec": 0, 00:19:44.375 "r_mbytes_per_sec": 0, 00:19:44.375 "w_mbytes_per_sec": 0 00:19:44.375 }, 00:19:44.375 "claimed": true, 00:19:44.375 "claim_type": "exclusive_write", 00:19:44.375 "zoned": false, 00:19:44.375 "supported_io_types": { 00:19:44.375 "read": true, 00:19:44.375 "write": true, 00:19:44.375 "unmap": true, 00:19:44.375 "flush": true, 00:19:44.375 "reset": true, 00:19:44.375 "nvme_admin": false, 00:19:44.375 "nvme_io": false, 00:19:44.375 "nvme_io_md": false, 00:19:44.375 "write_zeroes": true, 00:19:44.375 "zcopy": true, 00:19:44.375 "get_zone_info": false, 00:19:44.375 "zone_management": false, 00:19:44.375 "zone_append": false, 00:19:44.375 "compare": false, 00:19:44.375 "compare_and_write": false, 00:19:44.375 "abort": true, 00:19:44.375 "seek_hole": false, 00:19:44.375 "seek_data": false, 00:19:44.375 "copy": true, 00:19:44.375 "nvme_iov_md": false 00:19:44.375 }, 00:19:44.375 "memory_domains": [ 00:19:44.375 { 00:19:44.375 "dma_device_id": "system", 00:19:44.375 "dma_device_type": 1 00:19:44.375 }, 00:19:44.375 { 00:19:44.375 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:44.375 "dma_device_type": 2 00:19:44.375 } 00:19:44.375 ], 00:19:44.375 "driver_specific": {} 00:19:44.375 } 00:19:44.375 ] 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.375 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:19:44.634 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.634 "name": "Existed_Raid", 00:19:44.634 "uuid": "e97600aa-f2ce-413f-b34c-86ce3aa2b946", 00:19:44.634 "strip_size_kb": 64, 00:19:44.634 "state": "online", 00:19:44.634 "raid_level": "concat", 00:19:44.634 "superblock": true, 00:19:44.634 "num_base_bdevs": 3, 00:19:44.634 "num_base_bdevs_discovered": 3, 00:19:44.634 "num_base_bdevs_operational": 3, 00:19:44.634 "base_bdevs_list": [ 00:19:44.634 { 00:19:44.634 "name": "BaseBdev1", 00:19:44.634 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:44.634 "is_configured": true, 00:19:44.634 "data_offset": 2048, 00:19:44.634 "data_size": 63488 00:19:44.634 }, 00:19:44.634 { 00:19:44.634 "name": "BaseBdev2", 00:19:44.634 "uuid": "d02708c5-f249-475a-8a0c-6d5a348b4ebd", 00:19:44.634 "is_configured": true, 00:19:44.634 "data_offset": 2048, 00:19:44.634 "data_size": 63488 00:19:44.634 }, 00:19:44.634 { 00:19:44.634 "name": "BaseBdev3", 00:19:44.634 "uuid": "005caf76-7253-4377-970d-a513bbc5c470", 00:19:44.634 "is_configured": true, 00:19:44.634 "data_offset": 2048, 00:19:44.634 "data_size": 63488 00:19:44.634 } 00:19:44.634 ] 00:19:44.634 }' 00:19:44.634 17:13:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.634 17:13:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:45.201 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:45.459 [2024-07-23 17:13:40.796910] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.459 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:45.459 "name": "Existed_Raid", 00:19:45.459 "aliases": [ 00:19:45.459 "e97600aa-f2ce-413f-b34c-86ce3aa2b946" 00:19:45.459 ], 00:19:45.459 "product_name": "Raid Volume", 00:19:45.459 "block_size": 512, 00:19:45.459 "num_blocks": 190464, 00:19:45.459 "uuid": "e97600aa-f2ce-413f-b34c-86ce3aa2b946", 00:19:45.459 "assigned_rate_limits": { 00:19:45.459 "rw_ios_per_sec": 0, 00:19:45.459 "rw_mbytes_per_sec": 0, 00:19:45.459 "r_mbytes_per_sec": 0, 00:19:45.459 "w_mbytes_per_sec": 0 00:19:45.459 }, 00:19:45.459 "claimed": false, 00:19:45.459 "zoned": false, 00:19:45.459 "supported_io_types": { 00:19:45.459 "read": true, 00:19:45.459 "write": true, 00:19:45.459 "unmap": true, 00:19:45.459 "flush": true, 00:19:45.459 "reset": true, 00:19:45.459 "nvme_admin": false, 00:19:45.459 "nvme_io": false, 00:19:45.459 "nvme_io_md": false, 00:19:45.459 "write_zeroes": true, 00:19:45.459 "zcopy": false, 00:19:45.459 "get_zone_info": false, 00:19:45.459 "zone_management": false, 00:19:45.459 "zone_append": false, 00:19:45.459 "compare": false, 00:19:45.459 "compare_and_write": false, 00:19:45.459 "abort": false, 00:19:45.459 "seek_hole": false, 00:19:45.459 "seek_data": false, 00:19:45.459 "copy": false, 00:19:45.459 "nvme_iov_md": false 00:19:45.459 }, 00:19:45.459 "memory_domains": [ 00:19:45.459 { 00:19:45.460 "dma_device_id": "system", 
00:19:45.460 "dma_device_type": 1 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.460 "dma_device_type": 2 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "dma_device_id": "system", 00:19:45.460 "dma_device_type": 1 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.460 "dma_device_type": 2 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "dma_device_id": "system", 00:19:45.460 "dma_device_type": 1 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.460 "dma_device_type": 2 00:19:45.460 } 00:19:45.460 ], 00:19:45.460 "driver_specific": { 00:19:45.460 "raid": { 00:19:45.460 "uuid": "e97600aa-f2ce-413f-b34c-86ce3aa2b946", 00:19:45.460 "strip_size_kb": 64, 00:19:45.460 "state": "online", 00:19:45.460 "raid_level": "concat", 00:19:45.460 "superblock": true, 00:19:45.460 "num_base_bdevs": 3, 00:19:45.460 "num_base_bdevs_discovered": 3, 00:19:45.460 "num_base_bdevs_operational": 3, 00:19:45.460 "base_bdevs_list": [ 00:19:45.460 { 00:19:45.460 "name": "BaseBdev1", 00:19:45.460 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:45.460 "is_configured": true, 00:19:45.460 "data_offset": 2048, 00:19:45.460 "data_size": 63488 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "name": "BaseBdev2", 00:19:45.460 "uuid": "d02708c5-f249-475a-8a0c-6d5a348b4ebd", 00:19:45.460 "is_configured": true, 00:19:45.460 "data_offset": 2048, 00:19:45.460 "data_size": 63488 00:19:45.460 }, 00:19:45.460 { 00:19:45.460 "name": "BaseBdev3", 00:19:45.460 "uuid": "005caf76-7253-4377-970d-a513bbc5c470", 00:19:45.460 "is_configured": true, 00:19:45.460 "data_offset": 2048, 00:19:45.460 "data_size": 63488 00:19:45.460 } 00:19:45.460 ] 00:19:45.460 } 00:19:45.460 } 00:19:45.460 }' 00:19:45.460 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:45.460 17:13:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:45.460 BaseBdev2 00:19:45.460 BaseBdev3' 00:19:45.460 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:45.460 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:45.460 17:13:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:45.718 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:45.718 "name": "BaseBdev1", 00:19:45.718 "aliases": [ 00:19:45.718 "af48a377-3c93-468b-9d20-206e662d0977" 00:19:45.718 ], 00:19:45.718 "product_name": "Malloc disk", 00:19:45.718 "block_size": 512, 00:19:45.718 "num_blocks": 65536, 00:19:45.718 "uuid": "af48a377-3c93-468b-9d20-206e662d0977", 00:19:45.718 "assigned_rate_limits": { 00:19:45.718 "rw_ios_per_sec": 0, 00:19:45.718 "rw_mbytes_per_sec": 0, 00:19:45.718 "r_mbytes_per_sec": 0, 00:19:45.718 "w_mbytes_per_sec": 0 00:19:45.718 }, 00:19:45.718 "claimed": true, 00:19:45.718 "claim_type": "exclusive_write", 00:19:45.718 "zoned": false, 00:19:45.718 "supported_io_types": { 00:19:45.718 "read": true, 00:19:45.718 "write": true, 00:19:45.718 "unmap": true, 00:19:45.718 "flush": true, 00:19:45.718 "reset": true, 00:19:45.718 "nvme_admin": false, 00:19:45.718 "nvme_io": false, 00:19:45.718 "nvme_io_md": false, 00:19:45.718 "write_zeroes": true, 00:19:45.718 "zcopy": true, 00:19:45.718 "get_zone_info": false, 00:19:45.718 "zone_management": false, 00:19:45.718 "zone_append": false, 00:19:45.718 "compare": false, 00:19:45.718 "compare_and_write": false, 00:19:45.718 "abort": true, 00:19:45.718 "seek_hole": false, 00:19:45.718 "seek_data": false, 00:19:45.718 "copy": true, 00:19:45.718 "nvme_iov_md": false 00:19:45.718 }, 00:19:45.718 "memory_domains": 
[ 00:19:45.718 { 00:19:45.718 "dma_device_id": "system", 00:19:45.718 "dma_device_type": 1 00:19:45.718 }, 00:19:45.718 { 00:19:45.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:45.718 "dma_device_type": 2 00:19:45.718 } 00:19:45.718 ], 00:19:45.718 "driver_specific": {} 00:19:45.718 }' 00:19:45.718 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:45.976 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.234 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.234 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:46.234 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:46.234 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:46.235 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:46.493 "name": "BaseBdev2", 00:19:46.493 "aliases": [ 00:19:46.493 "d02708c5-f249-475a-8a0c-6d5a348b4ebd" 00:19:46.493 ], 00:19:46.493 "product_name": "Malloc disk", 00:19:46.493 "block_size": 512, 00:19:46.493 "num_blocks": 65536, 00:19:46.493 "uuid": "d02708c5-f249-475a-8a0c-6d5a348b4ebd", 00:19:46.493 "assigned_rate_limits": { 00:19:46.493 "rw_ios_per_sec": 0, 00:19:46.493 "rw_mbytes_per_sec": 0, 00:19:46.493 "r_mbytes_per_sec": 0, 00:19:46.493 "w_mbytes_per_sec": 0 00:19:46.493 }, 00:19:46.493 "claimed": true, 00:19:46.493 "claim_type": "exclusive_write", 00:19:46.493 "zoned": false, 00:19:46.493 "supported_io_types": { 00:19:46.493 "read": true, 00:19:46.493 "write": true, 00:19:46.493 "unmap": true, 00:19:46.493 "flush": true, 00:19:46.493 "reset": true, 00:19:46.493 "nvme_admin": false, 00:19:46.493 "nvme_io": false, 00:19:46.493 "nvme_io_md": false, 00:19:46.493 "write_zeroes": true, 00:19:46.493 "zcopy": true, 00:19:46.493 "get_zone_info": false, 00:19:46.493 "zone_management": false, 00:19:46.493 "zone_append": false, 00:19:46.493 "compare": false, 00:19:46.493 "compare_and_write": false, 00:19:46.493 "abort": true, 00:19:46.493 "seek_hole": false, 00:19:46.493 "seek_data": false, 00:19:46.493 "copy": true, 00:19:46.493 "nvme_iov_md": false 00:19:46.493 }, 00:19:46.493 "memory_domains": [ 00:19:46.493 { 00:19:46.493 "dma_device_id": "system", 00:19:46.493 "dma_device_type": 1 00:19:46.493 }, 00:19:46.493 { 00:19:46.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.493 "dma_device_type": 2 00:19:46.493 } 00:19:46.493 ], 00:19:46.493 "driver_specific": {} 00:19:46.493 }' 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:46.493 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.752 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:46.752 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:46.752 17:13:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.752 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:46.752 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:46.752 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:46.752 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:46.752 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.010 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.010 "name": "BaseBdev3", 00:19:47.010 "aliases": [ 00:19:47.010 "005caf76-7253-4377-970d-a513bbc5c470" 00:19:47.010 ], 00:19:47.010 "product_name": "Malloc disk", 00:19:47.010 "block_size": 512, 00:19:47.010 "num_blocks": 65536, 00:19:47.010 "uuid": "005caf76-7253-4377-970d-a513bbc5c470", 00:19:47.010 "assigned_rate_limits": { 00:19:47.010 "rw_ios_per_sec": 0, 00:19:47.010 "rw_mbytes_per_sec": 0, 00:19:47.010 "r_mbytes_per_sec": 0, 00:19:47.010 
"w_mbytes_per_sec": 0 00:19:47.010 }, 00:19:47.010 "claimed": true, 00:19:47.010 "claim_type": "exclusive_write", 00:19:47.010 "zoned": false, 00:19:47.010 "supported_io_types": { 00:19:47.010 "read": true, 00:19:47.010 "write": true, 00:19:47.010 "unmap": true, 00:19:47.010 "flush": true, 00:19:47.010 "reset": true, 00:19:47.010 "nvme_admin": false, 00:19:47.010 "nvme_io": false, 00:19:47.010 "nvme_io_md": false, 00:19:47.010 "write_zeroes": true, 00:19:47.010 "zcopy": true, 00:19:47.010 "get_zone_info": false, 00:19:47.010 "zone_management": false, 00:19:47.010 "zone_append": false, 00:19:47.010 "compare": false, 00:19:47.010 "compare_and_write": false, 00:19:47.010 "abort": true, 00:19:47.010 "seek_hole": false, 00:19:47.010 "seek_data": false, 00:19:47.010 "copy": true, 00:19:47.010 "nvme_iov_md": false 00:19:47.010 }, 00:19:47.010 "memory_domains": [ 00:19:47.010 { 00:19:47.010 "dma_device_id": "system", 00:19:47.010 "dma_device_type": 1 00:19:47.010 }, 00:19:47.010 { 00:19:47.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.010 "dma_device_type": 2 00:19:47.010 } 00:19:47.010 ], 00:19:47.010 "driver_specific": {} 00:19:47.010 }' 00:19:47.010 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.010 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.010 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.010 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.271 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:47.551 [2024-07-23 17:13:42.886215] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:47.551 [2024-07-23 17:13:42.886243] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:47.551 [2024-07-23 17:13:42.886284] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:47.551 17:13:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.822 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.822 "name": "Existed_Raid", 00:19:47.822 "uuid": "e97600aa-f2ce-413f-b34c-86ce3aa2b946", 00:19:47.822 "strip_size_kb": 64, 00:19:47.822 "state": "offline", 00:19:47.822 "raid_level": "concat", 00:19:47.822 "superblock": true, 00:19:47.822 "num_base_bdevs": 3, 00:19:47.822 "num_base_bdevs_discovered": 2, 00:19:47.822 "num_base_bdevs_operational": 2, 00:19:47.822 "base_bdevs_list": [ 00:19:47.822 { 00:19:47.822 "name": null, 00:19:47.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.822 "is_configured": false, 00:19:47.822 "data_offset": 2048, 00:19:47.822 "data_size": 63488 00:19:47.822 }, 00:19:47.822 { 00:19:47.822 "name": "BaseBdev2", 00:19:47.822 "uuid": "d02708c5-f249-475a-8a0c-6d5a348b4ebd", 00:19:47.822 "is_configured": true, 00:19:47.822 "data_offset": 2048, 00:19:47.822 "data_size": 
63488 00:19:47.822 }, 00:19:47.822 { 00:19:47.822 "name": "BaseBdev3", 00:19:47.822 "uuid": "005caf76-7253-4377-970d-a513bbc5c470", 00:19:47.822 "is_configured": true, 00:19:47.822 "data_offset": 2048, 00:19:47.822 "data_size": 63488 00:19:47.822 } 00:19:47.822 ] 00:19:47.822 }' 00:19:47.822 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.822 17:13:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:48.388 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:48.388 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:48.388 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:48.388 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.646 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:48.646 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:48.646 17:13:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:48.905 [2024-07-23 17:13:44.154584] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:48.905 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:48.905 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:48.905 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:19:48.905 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:49.163 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:49.163 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:49.163 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:49.422 [2024-07-23 17:13:44.658498] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:49.422 [2024-07-23 17:13:44.658541] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fef800 name Existed_Raid, state offline 00:19:49.422 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:49.422 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:49.422 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.422 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:49.680 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:49.680 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:49.680 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:19:49.680 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:49.680 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:49.680 17:13:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:49.939 BaseBdev2 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:49.939 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:50.197 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:50.456 [ 00:19:50.456 { 00:19:50.456 "name": "BaseBdev2", 00:19:50.456 "aliases": [ 00:19:50.456 "d99d6e82-5ecc-487f-8878-6b00b57a26a3" 00:19:50.456 ], 00:19:50.456 "product_name": "Malloc disk", 00:19:50.456 "block_size": 512, 00:19:50.456 "num_blocks": 65536, 00:19:50.456 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:50.456 "assigned_rate_limits": { 00:19:50.456 "rw_ios_per_sec": 0, 00:19:50.456 "rw_mbytes_per_sec": 0, 00:19:50.456 "r_mbytes_per_sec": 0, 00:19:50.456 "w_mbytes_per_sec": 0 00:19:50.456 }, 00:19:50.456 "claimed": false, 00:19:50.456 "zoned": false, 00:19:50.456 "supported_io_types": { 00:19:50.456 "read": true, 00:19:50.456 "write": true, 00:19:50.456 "unmap": true, 00:19:50.456 "flush": 
true, 00:19:50.456 "reset": true, 00:19:50.456 "nvme_admin": false, 00:19:50.456 "nvme_io": false, 00:19:50.456 "nvme_io_md": false, 00:19:50.456 "write_zeroes": true, 00:19:50.456 "zcopy": true, 00:19:50.456 "get_zone_info": false, 00:19:50.456 "zone_management": false, 00:19:50.456 "zone_append": false, 00:19:50.456 "compare": false, 00:19:50.456 "compare_and_write": false, 00:19:50.456 "abort": true, 00:19:50.456 "seek_hole": false, 00:19:50.456 "seek_data": false, 00:19:50.456 "copy": true, 00:19:50.456 "nvme_iov_md": false 00:19:50.456 }, 00:19:50.456 "memory_domains": [ 00:19:50.456 { 00:19:50.456 "dma_device_id": "system", 00:19:50.456 "dma_device_type": 1 00:19:50.456 }, 00:19:50.456 { 00:19:50.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.456 "dma_device_type": 2 00:19:50.456 } 00:19:50.456 ], 00:19:50.456 "driver_specific": {} 00:19:50.456 } 00:19:50.456 ] 00:19:50.456 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:50.456 17:13:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:50.456 17:13:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:50.456 17:13:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:50.715 BaseBdev3 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:50.715 17:13:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:50.973 17:13:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:50.973 [ 00:19:50.973 { 00:19:50.973 "name": "BaseBdev3", 00:19:50.973 "aliases": [ 00:19:50.973 "b484ba04-653e-44c6-8574-71e2f37ead55" 00:19:50.973 ], 00:19:50.973 "product_name": "Malloc disk", 00:19:50.973 "block_size": 512, 00:19:50.973 "num_blocks": 65536, 00:19:50.973 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:50.973 "assigned_rate_limits": { 00:19:50.973 "rw_ios_per_sec": 0, 00:19:50.973 "rw_mbytes_per_sec": 0, 00:19:50.973 "r_mbytes_per_sec": 0, 00:19:50.973 "w_mbytes_per_sec": 0 00:19:50.973 }, 00:19:50.973 "claimed": false, 00:19:50.973 "zoned": false, 00:19:50.973 "supported_io_types": { 00:19:50.973 "read": true, 00:19:50.973 "write": true, 00:19:50.973 "unmap": true, 00:19:50.973 "flush": true, 00:19:50.973 "reset": true, 00:19:50.973 "nvme_admin": false, 00:19:50.973 "nvme_io": false, 00:19:50.973 "nvme_io_md": false, 00:19:50.973 "write_zeroes": true, 00:19:50.973 "zcopy": true, 00:19:50.973 "get_zone_info": false, 00:19:50.973 "zone_management": false, 00:19:50.973 "zone_append": false, 00:19:50.973 "compare": false, 00:19:50.973 "compare_and_write": false, 00:19:50.973 "abort": true, 00:19:50.973 "seek_hole": false, 00:19:50.973 "seek_data": false, 00:19:50.973 "copy": true, 00:19:50.973 "nvme_iov_md": false 00:19:50.973 }, 00:19:50.973 "memory_domains": [ 00:19:50.973 { 00:19:50.973 "dma_device_id": "system", 00:19:50.973 "dma_device_type": 1 
00:19:50.973 }, 00:19:50.973 { 00:19:50.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.973 "dma_device_type": 2 00:19:50.973 } 00:19:50.973 ], 00:19:50.973 "driver_specific": {} 00:19:50.973 } 00:19:50.973 ] 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:19:51.233 [2024-07-23 17:13:46.632966] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:51.233 [2024-07-23 17:13:46.633009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:51.233 [2024-07-23 17:13:46.633029] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:51.233 [2024-07-23 17:13:46.634389] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.233 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.491 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.491 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.491 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.491 "name": "Existed_Raid", 00:19:51.491 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:51.491 "strip_size_kb": 64, 00:19:51.491 "state": "configuring", 00:19:51.491 "raid_level": "concat", 00:19:51.491 "superblock": true, 00:19:51.491 "num_base_bdevs": 3, 00:19:51.491 "num_base_bdevs_discovered": 2, 00:19:51.491 "num_base_bdevs_operational": 3, 00:19:51.491 "base_bdevs_list": [ 00:19:51.491 { 00:19:51.491 "name": "BaseBdev1", 00:19:51.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.491 "is_configured": false, 00:19:51.491 "data_offset": 0, 00:19:51.491 "data_size": 0 00:19:51.491 }, 00:19:51.491 { 00:19:51.491 "name": "BaseBdev2", 00:19:51.491 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:51.491 "is_configured": true, 00:19:51.491 "data_offset": 2048, 00:19:51.491 "data_size": 63488 00:19:51.491 }, 00:19:51.491 { 00:19:51.491 "name": "BaseBdev3", 00:19:51.491 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:51.491 "is_configured": true, 00:19:51.491 "data_offset": 2048, 00:19:51.491 
"data_size": 63488 00:19:51.491 } 00:19:51.491 ] 00:19:51.491 }' 00:19:51.491 17:13:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.491 17:13:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:52.425 [2024-07-23 17:13:47.731823] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:19:52.425 17:13:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.682 17:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.682 "name": "Existed_Raid", 00:19:52.682 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:52.682 "strip_size_kb": 64, 00:19:52.682 "state": "configuring", 00:19:52.682 "raid_level": "concat", 00:19:52.682 "superblock": true, 00:19:52.682 "num_base_bdevs": 3, 00:19:52.683 "num_base_bdevs_discovered": 1, 00:19:52.683 "num_base_bdevs_operational": 3, 00:19:52.683 "base_bdevs_list": [ 00:19:52.683 { 00:19:52.683 "name": "BaseBdev1", 00:19:52.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.683 "is_configured": false, 00:19:52.683 "data_offset": 0, 00:19:52.683 "data_size": 0 00:19:52.683 }, 00:19:52.683 { 00:19:52.683 "name": null, 00:19:52.683 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:52.683 "is_configured": false, 00:19:52.683 "data_offset": 2048, 00:19:52.683 "data_size": 63488 00:19:52.683 }, 00:19:52.683 { 00:19:52.683 "name": "BaseBdev3", 00:19:52.683 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:52.683 "is_configured": true, 00:19:52.683 "data_offset": 2048, 00:19:52.683 "data_size": 63488 00:19:52.683 } 00:19:52.683 ] 00:19:52.683 }' 00:19:52.683 17:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.683 17:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:53.248 17:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.248 17:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:53.506 17:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:19:53.506 17:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:53.764 [2024-07-23 17:13:49.063948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:53.764 BaseBdev1 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:53.764 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:54.022 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:54.280 [ 00:19:54.280 { 00:19:54.280 "name": "BaseBdev1", 00:19:54.280 "aliases": [ 00:19:54.280 "6dbcb144-267e-48eb-b06d-a841d57feb5b" 00:19:54.280 ], 00:19:54.280 "product_name": "Malloc disk", 00:19:54.280 "block_size": 512, 00:19:54.280 "num_blocks": 65536, 00:19:54.280 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:19:54.280 "assigned_rate_limits": { 00:19:54.280 "rw_ios_per_sec": 0, 00:19:54.280 "rw_mbytes_per_sec": 0, 00:19:54.280 "r_mbytes_per_sec": 0, 00:19:54.280 
"w_mbytes_per_sec": 0 00:19:54.280 }, 00:19:54.280 "claimed": true, 00:19:54.280 "claim_type": "exclusive_write", 00:19:54.280 "zoned": false, 00:19:54.280 "supported_io_types": { 00:19:54.280 "read": true, 00:19:54.280 "write": true, 00:19:54.280 "unmap": true, 00:19:54.280 "flush": true, 00:19:54.280 "reset": true, 00:19:54.280 "nvme_admin": false, 00:19:54.280 "nvme_io": false, 00:19:54.280 "nvme_io_md": false, 00:19:54.280 "write_zeroes": true, 00:19:54.280 "zcopy": true, 00:19:54.280 "get_zone_info": false, 00:19:54.280 "zone_management": false, 00:19:54.280 "zone_append": false, 00:19:54.280 "compare": false, 00:19:54.280 "compare_and_write": false, 00:19:54.280 "abort": true, 00:19:54.280 "seek_hole": false, 00:19:54.280 "seek_data": false, 00:19:54.280 "copy": true, 00:19:54.280 "nvme_iov_md": false 00:19:54.280 }, 00:19:54.280 "memory_domains": [ 00:19:54.280 { 00:19:54.280 "dma_device_id": "system", 00:19:54.280 "dma_device_type": 1 00:19:54.280 }, 00:19:54.280 { 00:19:54.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.280 "dma_device_type": 2 00:19:54.280 } 00:19:54.280 ], 00:19:54.281 "driver_specific": {} 00:19:54.281 } 00:19:54.281 ] 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.281 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:54.539 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.539 "name": "Existed_Raid", 00:19:54.539 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:54.539 "strip_size_kb": 64, 00:19:54.539 "state": "configuring", 00:19:54.539 "raid_level": "concat", 00:19:54.539 "superblock": true, 00:19:54.539 "num_base_bdevs": 3, 00:19:54.539 "num_base_bdevs_discovered": 2, 00:19:54.539 "num_base_bdevs_operational": 3, 00:19:54.539 "base_bdevs_list": [ 00:19:54.539 { 00:19:54.539 "name": "BaseBdev1", 00:19:54.539 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:19:54.539 "is_configured": true, 00:19:54.539 "data_offset": 2048, 00:19:54.539 "data_size": 63488 00:19:54.539 }, 00:19:54.539 { 00:19:54.539 "name": null, 00:19:54.539 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:54.539 "is_configured": false, 00:19:54.539 "data_offset": 2048, 00:19:54.539 "data_size": 63488 00:19:54.539 }, 00:19:54.539 { 00:19:54.539 "name": "BaseBdev3", 00:19:54.539 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:54.539 "is_configured": true, 00:19:54.539 "data_offset": 2048, 00:19:54.539 "data_size": 63488 00:19:54.539 } 
00:19:54.539 ] 00:19:54.539 }' 00:19:54.539 17:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.539 17:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:55.105 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.105 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:55.362 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:55.362 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:55.620 [2024-07-23 17:13:50.880778] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:55.620 
17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.620 17:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:55.877 17:13:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.877 "name": "Existed_Raid", 00:19:55.877 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:55.877 "strip_size_kb": 64, 00:19:55.877 "state": "configuring", 00:19:55.877 "raid_level": "concat", 00:19:55.877 "superblock": true, 00:19:55.877 "num_base_bdevs": 3, 00:19:55.877 "num_base_bdevs_discovered": 1, 00:19:55.877 "num_base_bdevs_operational": 3, 00:19:55.877 "base_bdevs_list": [ 00:19:55.877 { 00:19:55.877 "name": "BaseBdev1", 00:19:55.877 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:19:55.877 "is_configured": true, 00:19:55.877 "data_offset": 2048, 00:19:55.877 "data_size": 63488 00:19:55.877 }, 00:19:55.877 { 00:19:55.877 "name": null, 00:19:55.877 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:55.877 "is_configured": false, 00:19:55.877 "data_offset": 2048, 00:19:55.878 "data_size": 63488 00:19:55.878 }, 00:19:55.878 { 00:19:55.878 "name": null, 00:19:55.878 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:55.878 "is_configured": false, 00:19:55.878 "data_offset": 2048, 00:19:55.878 "data_size": 63488 00:19:55.878 } 00:19:55.878 ] 00:19:55.878 }' 00:19:55.878 17:13:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.878 17:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:56.444 17:13:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.444 17:13:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:56.702 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:56.702 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:56.961 [2024-07-23 17:13:52.236388] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.961 17:13:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.961 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:57.219 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:57.219 "name": "Existed_Raid", 00:19:57.219 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:57.219 "strip_size_kb": 64, 00:19:57.219 "state": "configuring", 00:19:57.219 "raid_level": "concat", 00:19:57.219 "superblock": true, 00:19:57.219 "num_base_bdevs": 3, 00:19:57.219 "num_base_bdevs_discovered": 2, 00:19:57.219 "num_base_bdevs_operational": 3, 00:19:57.219 "base_bdevs_list": [ 00:19:57.219 { 00:19:57.219 "name": "BaseBdev1", 00:19:57.219 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:19:57.219 "is_configured": true, 00:19:57.219 "data_offset": 2048, 00:19:57.219 "data_size": 63488 00:19:57.219 }, 00:19:57.219 { 00:19:57.219 "name": null, 00:19:57.219 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:57.219 "is_configured": false, 00:19:57.219 "data_offset": 2048, 00:19:57.219 "data_size": 63488 00:19:57.219 }, 00:19:57.219 { 00:19:57.219 "name": "BaseBdev3", 00:19:57.219 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:57.219 "is_configured": true, 00:19:57.219 "data_offset": 2048, 00:19:57.219 "data_size": 63488 00:19:57.219 } 00:19:57.219 ] 00:19:57.219 }' 00:19:57.219 17:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:57.219 17:13:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:57.785 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.785 17:13:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:58.044 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:58.044 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:58.302 [2024-07-23 17:13:53.583972] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.302 17:13:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.559 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.559 "name": "Existed_Raid", 00:19:58.559 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:58.559 "strip_size_kb": 64, 00:19:58.559 "state": "configuring", 00:19:58.559 "raid_level": "concat", 00:19:58.559 "superblock": true, 00:19:58.559 "num_base_bdevs": 3, 00:19:58.559 "num_base_bdevs_discovered": 1, 00:19:58.559 "num_base_bdevs_operational": 3, 00:19:58.559 "base_bdevs_list": [ 00:19:58.559 { 00:19:58.559 "name": null, 00:19:58.559 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:19:58.559 "is_configured": false, 00:19:58.559 "data_offset": 2048, 00:19:58.560 "data_size": 63488 00:19:58.560 }, 00:19:58.560 { 00:19:58.560 "name": null, 00:19:58.560 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:58.560 "is_configured": false, 00:19:58.560 "data_offset": 2048, 00:19:58.560 "data_size": 63488 00:19:58.560 }, 00:19:58.560 { 00:19:58.560 "name": "BaseBdev3", 00:19:58.560 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:58.560 "is_configured": true, 00:19:58.560 "data_offset": 2048, 00:19:58.560 "data_size": 63488 00:19:58.560 } 00:19:58.560 ] 00:19:58.560 }' 00:19:58.560 17:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.560 17:13:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:59.126 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.126 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:59.384 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:59.384 17:13:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:59.642 [2024-07-23 17:13:54.980077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.642 17:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.642 17:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.642 17:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:59.901 17:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.901 "name": 
"Existed_Raid", 00:19:59.901 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:19:59.901 "strip_size_kb": 64, 00:19:59.901 "state": "configuring", 00:19:59.901 "raid_level": "concat", 00:19:59.901 "superblock": true, 00:19:59.901 "num_base_bdevs": 3, 00:19:59.901 "num_base_bdevs_discovered": 2, 00:19:59.901 "num_base_bdevs_operational": 3, 00:19:59.901 "base_bdevs_list": [ 00:19:59.901 { 00:19:59.901 "name": null, 00:19:59.901 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:19:59.901 "is_configured": false, 00:19:59.901 "data_offset": 2048, 00:19:59.901 "data_size": 63488 00:19:59.901 }, 00:19:59.901 { 00:19:59.901 "name": "BaseBdev2", 00:19:59.901 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:19:59.901 "is_configured": true, 00:19:59.901 "data_offset": 2048, 00:19:59.901 "data_size": 63488 00:19:59.901 }, 00:19:59.901 { 00:19:59.901 "name": "BaseBdev3", 00:19:59.901 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:19:59.901 "is_configured": true, 00:19:59.901 "data_offset": 2048, 00:19:59.901 "data_size": 63488 00:19:59.901 } 00:19:59.901 ] 00:19:59.901 }' 00:19:59.901 17:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.901 17:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.834 17:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.834 17:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:01.091 17:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:01.091 17:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.091 17:13:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:01.348 17:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6dbcb144-267e-48eb-b06d-a841d57feb5b 00:20:01.605 [2024-07-23 17:13:56.977942] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:01.605 [2024-07-23 17:13:56.978100] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ff2b90 00:20:01.605 [2024-07-23 17:13:56.978115] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:20:01.605 [2024-07-23 17:13:56.978291] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff2e70 00:20:01.605 [2024-07-23 17:13:56.978401] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ff2b90 00:20:01.605 [2024-07-23 17:13:56.978411] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ff2b90 00:20:01.605 [2024-07-23 17:13:56.978498] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:01.605 NewBaseBdev 00:20:01.605 17:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:01.605 17:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:01.605 17:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.605 17:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:01.605 17:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.605 17:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.605 17:13:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:01.879 17:13:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:02.149 [ 00:20:02.149 { 00:20:02.149 "name": "NewBaseBdev", 00:20:02.149 "aliases": [ 00:20:02.149 "6dbcb144-267e-48eb-b06d-a841d57feb5b" 00:20:02.149 ], 00:20:02.149 "product_name": "Malloc disk", 00:20:02.149 "block_size": 512, 00:20:02.149 "num_blocks": 65536, 00:20:02.149 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:20:02.149 "assigned_rate_limits": { 00:20:02.149 "rw_ios_per_sec": 0, 00:20:02.149 "rw_mbytes_per_sec": 0, 00:20:02.149 "r_mbytes_per_sec": 0, 00:20:02.149 "w_mbytes_per_sec": 0 00:20:02.149 }, 00:20:02.149 "claimed": true, 00:20:02.149 "claim_type": "exclusive_write", 00:20:02.149 "zoned": false, 00:20:02.149 "supported_io_types": { 00:20:02.149 "read": true, 00:20:02.149 "write": true, 00:20:02.149 "unmap": true, 00:20:02.149 "flush": true, 00:20:02.149 "reset": true, 00:20:02.149 "nvme_admin": false, 00:20:02.149 "nvme_io": false, 00:20:02.149 "nvme_io_md": false, 00:20:02.149 "write_zeroes": true, 00:20:02.149 "zcopy": true, 00:20:02.149 "get_zone_info": false, 00:20:02.149 "zone_management": false, 00:20:02.149 "zone_append": false, 00:20:02.149 "compare": false, 00:20:02.149 "compare_and_write": false, 00:20:02.149 "abort": true, 00:20:02.149 "seek_hole": false, 00:20:02.149 "seek_data": false, 00:20:02.149 "copy": true, 00:20:02.149 "nvme_iov_md": false 00:20:02.149 }, 00:20:02.149 "memory_domains": [ 00:20:02.149 { 00:20:02.149 "dma_device_id": "system", 00:20:02.149 "dma_device_type": 1 00:20:02.149 }, 00:20:02.149 { 00:20:02.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:02.149 "dma_device_type": 2 00:20:02.149 } 
00:20:02.149 ], 00:20:02.149 "driver_specific": {} 00:20:02.149 } 00:20:02.149 ] 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.149 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:02.716 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.716 "name": "Existed_Raid", 00:20:02.716 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:20:02.716 "strip_size_kb": 64, 00:20:02.716 "state": "online", 00:20:02.716 
"raid_level": "concat", 00:20:02.716 "superblock": true, 00:20:02.716 "num_base_bdevs": 3, 00:20:02.716 "num_base_bdevs_discovered": 3, 00:20:02.716 "num_base_bdevs_operational": 3, 00:20:02.716 "base_bdevs_list": [ 00:20:02.716 { 00:20:02.716 "name": "NewBaseBdev", 00:20:02.716 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:20:02.716 "is_configured": true, 00:20:02.716 "data_offset": 2048, 00:20:02.716 "data_size": 63488 00:20:02.716 }, 00:20:02.716 { 00:20:02.716 "name": "BaseBdev2", 00:20:02.716 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:20:02.716 "is_configured": true, 00:20:02.716 "data_offset": 2048, 00:20:02.716 "data_size": 63488 00:20:02.716 }, 00:20:02.716 { 00:20:02.716 "name": "BaseBdev3", 00:20:02.716 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:20:02.716 "is_configured": true, 00:20:02.716 "data_offset": 2048, 00:20:02.716 "data_size": 63488 00:20:02.716 } 00:20:02.716 ] 00:20:02.716 }' 00:20:02.716 17:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.716 17:13:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:03.283 17:13:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:03.542 [2024-07-23 17:13:58.851193] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:03.542 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:03.542 "name": "Existed_Raid", 00:20:03.542 "aliases": [ 00:20:03.542 "66b3d9df-d6a3-4265-aabe-cc28a3a83d22" 00:20:03.542 ], 00:20:03.542 "product_name": "Raid Volume", 00:20:03.542 "block_size": 512, 00:20:03.542 "num_blocks": 190464, 00:20:03.542 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:20:03.542 "assigned_rate_limits": { 00:20:03.542 "rw_ios_per_sec": 0, 00:20:03.542 "rw_mbytes_per_sec": 0, 00:20:03.542 "r_mbytes_per_sec": 0, 00:20:03.542 "w_mbytes_per_sec": 0 00:20:03.542 }, 00:20:03.542 "claimed": false, 00:20:03.542 "zoned": false, 00:20:03.542 "supported_io_types": { 00:20:03.542 "read": true, 00:20:03.542 "write": true, 00:20:03.542 "unmap": true, 00:20:03.542 "flush": true, 00:20:03.542 "reset": true, 00:20:03.542 "nvme_admin": false, 00:20:03.542 "nvme_io": false, 00:20:03.542 "nvme_io_md": false, 00:20:03.542 "write_zeroes": true, 00:20:03.542 "zcopy": false, 00:20:03.542 "get_zone_info": false, 00:20:03.542 "zone_management": false, 00:20:03.542 "zone_append": false, 00:20:03.542 "compare": false, 00:20:03.542 "compare_and_write": false, 00:20:03.542 "abort": false, 00:20:03.542 "seek_hole": false, 00:20:03.542 "seek_data": false, 00:20:03.542 "copy": false, 00:20:03.542 "nvme_iov_md": false 00:20:03.542 }, 00:20:03.542 "memory_domains": [ 00:20:03.542 { 00:20:03.542 "dma_device_id": "system", 00:20:03.542 "dma_device_type": 1 00:20:03.542 }, 00:20:03.542 { 00:20:03.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.542 "dma_device_type": 2 00:20:03.542 }, 00:20:03.542 { 00:20:03.542 "dma_device_id": "system", 00:20:03.542 "dma_device_type": 1 00:20:03.542 }, 
00:20:03.542 { 00:20:03.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.542 "dma_device_type": 2 00:20:03.542 }, 00:20:03.542 { 00:20:03.542 "dma_device_id": "system", 00:20:03.542 "dma_device_type": 1 00:20:03.542 }, 00:20:03.542 { 00:20:03.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.542 "dma_device_type": 2 00:20:03.542 } 00:20:03.542 ], 00:20:03.542 "driver_specific": { 00:20:03.542 "raid": { 00:20:03.542 "uuid": "66b3d9df-d6a3-4265-aabe-cc28a3a83d22", 00:20:03.542 "strip_size_kb": 64, 00:20:03.542 "state": "online", 00:20:03.542 "raid_level": "concat", 00:20:03.542 "superblock": true, 00:20:03.542 "num_base_bdevs": 3, 00:20:03.542 "num_base_bdevs_discovered": 3, 00:20:03.542 "num_base_bdevs_operational": 3, 00:20:03.542 "base_bdevs_list": [ 00:20:03.542 { 00:20:03.542 "name": "NewBaseBdev", 00:20:03.542 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:20:03.542 "is_configured": true, 00:20:03.542 "data_offset": 2048, 00:20:03.542 "data_size": 63488 00:20:03.542 }, 00:20:03.542 { 00:20:03.542 "name": "BaseBdev2", 00:20:03.542 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:20:03.542 "is_configured": true, 00:20:03.542 "data_offset": 2048, 00:20:03.542 "data_size": 63488 00:20:03.542 }, 00:20:03.542 { 00:20:03.542 "name": "BaseBdev3", 00:20:03.542 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:20:03.542 "is_configured": true, 00:20:03.542 "data_offset": 2048, 00:20:03.542 "data_size": 63488 00:20:03.542 } 00:20:03.542 ] 00:20:03.542 } 00:20:03.542 } 00:20:03.542 }' 00:20:03.542 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:03.542 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:03.542 BaseBdev2 00:20:03.542 BaseBdev3' 00:20:03.542 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:03.542 
17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:03.542 17:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:03.801 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:03.801 "name": "NewBaseBdev", 00:20:03.801 "aliases": [ 00:20:03.801 "6dbcb144-267e-48eb-b06d-a841d57feb5b" 00:20:03.801 ], 00:20:03.801 "product_name": "Malloc disk", 00:20:03.801 "block_size": 512, 00:20:03.801 "num_blocks": 65536, 00:20:03.801 "uuid": "6dbcb144-267e-48eb-b06d-a841d57feb5b", 00:20:03.801 "assigned_rate_limits": { 00:20:03.801 "rw_ios_per_sec": 0, 00:20:03.801 "rw_mbytes_per_sec": 0, 00:20:03.801 "r_mbytes_per_sec": 0, 00:20:03.801 "w_mbytes_per_sec": 0 00:20:03.801 }, 00:20:03.801 "claimed": true, 00:20:03.801 "claim_type": "exclusive_write", 00:20:03.801 "zoned": false, 00:20:03.801 "supported_io_types": { 00:20:03.801 "read": true, 00:20:03.801 "write": true, 00:20:03.801 "unmap": true, 00:20:03.801 "flush": true, 00:20:03.801 "reset": true, 00:20:03.801 "nvme_admin": false, 00:20:03.801 "nvme_io": false, 00:20:03.801 "nvme_io_md": false, 00:20:03.801 "write_zeroes": true, 00:20:03.801 "zcopy": true, 00:20:03.801 "get_zone_info": false, 00:20:03.801 "zone_management": false, 00:20:03.801 "zone_append": false, 00:20:03.801 "compare": false, 00:20:03.801 "compare_and_write": false, 00:20:03.801 "abort": true, 00:20:03.801 "seek_hole": false, 00:20:03.801 "seek_data": false, 00:20:03.801 "copy": true, 00:20:03.801 "nvme_iov_md": false 00:20:03.801 }, 00:20:03.801 "memory_domains": [ 00:20:03.801 { 00:20:03.801 "dma_device_id": "system", 00:20:03.801 "dma_device_type": 1 00:20:03.801 }, 00:20:03.801 { 00:20:03.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.801 "dma_device_type": 2 00:20:03.801 } 00:20:03.801 ], 00:20:03.801 
"driver_specific": {} 00:20:03.801 }' 00:20:03.801 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.801 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:04.060 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.319 "name": "BaseBdev2", 00:20:04.319 "aliases": [ 00:20:04.319 "d99d6e82-5ecc-487f-8878-6b00b57a26a3" 00:20:04.319 ], 00:20:04.319 "product_name": 
"Malloc disk", 00:20:04.319 "block_size": 512, 00:20:04.319 "num_blocks": 65536, 00:20:04.319 "uuid": "d99d6e82-5ecc-487f-8878-6b00b57a26a3", 00:20:04.319 "assigned_rate_limits": { 00:20:04.319 "rw_ios_per_sec": 0, 00:20:04.319 "rw_mbytes_per_sec": 0, 00:20:04.319 "r_mbytes_per_sec": 0, 00:20:04.319 "w_mbytes_per_sec": 0 00:20:04.319 }, 00:20:04.319 "claimed": true, 00:20:04.319 "claim_type": "exclusive_write", 00:20:04.319 "zoned": false, 00:20:04.319 "supported_io_types": { 00:20:04.319 "read": true, 00:20:04.319 "write": true, 00:20:04.319 "unmap": true, 00:20:04.319 "flush": true, 00:20:04.319 "reset": true, 00:20:04.319 "nvme_admin": false, 00:20:04.319 "nvme_io": false, 00:20:04.319 "nvme_io_md": false, 00:20:04.319 "write_zeroes": true, 00:20:04.319 "zcopy": true, 00:20:04.319 "get_zone_info": false, 00:20:04.319 "zone_management": false, 00:20:04.319 "zone_append": false, 00:20:04.319 "compare": false, 00:20:04.319 "compare_and_write": false, 00:20:04.319 "abort": true, 00:20:04.319 "seek_hole": false, 00:20:04.319 "seek_data": false, 00:20:04.319 "copy": true, 00:20:04.319 "nvme_iov_md": false 00:20:04.319 }, 00:20:04.319 "memory_domains": [ 00:20:04.319 { 00:20:04.319 "dma_device_id": "system", 00:20:04.319 "dma_device_type": 1 00:20:04.319 }, 00:20:04.319 { 00:20:04.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.319 "dma_device_type": 2 00:20:04.319 } 00:20:04.319 ], 00:20:04.319 "driver_specific": {} 00:20:04.319 }' 00:20:04.319 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.577 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.577 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.577 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.577 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.577 
17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.577 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.577 17:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:04.836 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:05.094 "name": "BaseBdev3", 00:20:05.094 "aliases": [ 00:20:05.094 "b484ba04-653e-44c6-8574-71e2f37ead55" 00:20:05.094 ], 00:20:05.094 "product_name": "Malloc disk", 00:20:05.094 "block_size": 512, 00:20:05.094 "num_blocks": 65536, 00:20:05.094 "uuid": "b484ba04-653e-44c6-8574-71e2f37ead55", 00:20:05.094 "assigned_rate_limits": { 00:20:05.094 "rw_ios_per_sec": 0, 00:20:05.094 "rw_mbytes_per_sec": 0, 00:20:05.094 "r_mbytes_per_sec": 0, 00:20:05.094 "w_mbytes_per_sec": 0 00:20:05.094 }, 00:20:05.094 "claimed": true, 00:20:05.094 "claim_type": "exclusive_write", 00:20:05.094 "zoned": false, 00:20:05.094 "supported_io_types": { 00:20:05.094 "read": true, 00:20:05.094 "write": true, 00:20:05.094 "unmap": true, 
00:20:05.094 "flush": true, 00:20:05.094 "reset": true, 00:20:05.094 "nvme_admin": false, 00:20:05.094 "nvme_io": false, 00:20:05.094 "nvme_io_md": false, 00:20:05.094 "write_zeroes": true, 00:20:05.094 "zcopy": true, 00:20:05.094 "get_zone_info": false, 00:20:05.094 "zone_management": false, 00:20:05.094 "zone_append": false, 00:20:05.094 "compare": false, 00:20:05.094 "compare_and_write": false, 00:20:05.094 "abort": true, 00:20:05.094 "seek_hole": false, 00:20:05.094 "seek_data": false, 00:20:05.094 "copy": true, 00:20:05.094 "nvme_iov_md": false 00:20:05.094 }, 00:20:05.094 "memory_domains": [ 00:20:05.094 { 00:20:05.094 "dma_device_id": "system", 00:20:05.094 "dma_device_type": 1 00:20:05.094 }, 00:20:05.094 { 00:20:05.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.094 "dma_device_type": 2 00:20:05.094 } 00:20:05.094 ], 00:20:05.094 "driver_specific": {} 00:20:05.094 }' 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:05.094 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.352 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.352 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.353 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.353 17:14:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.353 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.353 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:05.611 [2024-07-23 17:14:00.848207] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:05.611 [2024-07-23 17:14:00.848235] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:05.611 [2024-07-23 17:14:00.848283] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:05.611 [2024-07-23 17:14:00.848337] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:05.611 [2024-07-23 17:14:00.848349] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ff2b90 name Existed_Raid, state offline 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4147678 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4147678 ']' 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4147678 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4147678 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4147678' 00:20:05.611 killing process with pid 4147678 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4147678 00:20:05.611 [2024-07-23 17:14:00.921250] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:05.611 17:14:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4147678 00:20:05.611 [2024-07-23 17:14:00.948706] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:05.870 17:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:05.870 00:20:05.870 real 0m29.298s 00:20:05.870 user 0m53.858s 00:20:05.870 sys 0m5.212s 00:20:05.870 17:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:05.870 17:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.870 ************************************ 00:20:05.870 END TEST raid_state_function_test_sb 00:20:05.870 ************************************ 00:20:05.870 17:14:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:05.870 17:14:01 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:20:05.870 17:14:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:05.870 17:14:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:05.870 17:14:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:05.870 ************************************ 00:20:05.870 START TEST raid_superblock_test 00:20:05.870 ************************************ 00:20:05.870 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:20:05.870 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4151982 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4151982 /var/tmp/spdk-raid.sock 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4151982 ']' 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:05.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:05.871 17:14:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.130 [2024-07-23 17:14:01.311968] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:20:06.130 [2024-07-23 17:14:01.312040] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4151982 ] 00:20:06.130 [2024-07-23 17:14:01.446291] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:06.130 [2024-07-23 17:14:01.502017] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:06.388 [2024-07-23 17:14:01.566466] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:06.388 [2024-07-23 17:14:01.566507] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:06.956 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:07.215 malloc1 00:20:07.215 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:07.474 [2024-07-23 17:14:02.736182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:07.474 [2024-07-23 17:14:02.736233] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.474 [2024-07-23 17:14:02.736256] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe3070 00:20:07.474 [2024-07-23 17:14:02.736269] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.474 [2024-07-23 17:14:02.737803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.474 [2024-07-23 17:14:02.737831] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:07.474 pt1 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:07.474 17:14:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:07.474 17:14:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:07.733 malloc2 00:20:07.733 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:07.993 [2024-07-23 17:14:03.242319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:07.993 [2024-07-23 17:14:03.242366] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.993 [2024-07-23 17:14:03.242383] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xac9920 00:20:07.993 [2024-07-23 17:14:03.242395] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.993 [2024-07-23 17:14:03.243959] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.993 [2024-07-23 17:14:03.243986] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:07.993 pt2 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:07.993 17:14:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:07.993 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:08.251 malloc3 00:20:08.252 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:08.510 [2024-07-23 17:14:03.744211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:08.510 [2024-07-23 17:14:03.744257] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:08.510 [2024-07-23 17:14:03.744275] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbdb3e0 00:20:08.510 [2024-07-23 17:14:03.744288] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:08.510 [2024-07-23 17:14:03.745685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:08.510 [2024-07-23 17:14:03.745711] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:08.510 pt3 00:20:08.510 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:08.510 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:08.510 17:14:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:20:08.769 [2024-07-23 17:14:03.992909] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:08.769 [2024-07-23 17:14:03.994067] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:20:08.769 [2024-07-23 17:14:03.994121] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:08.769 [2024-07-23 17:14:03.994269] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xbdd870 00:20:08.769 [2024-07-23 17:14:03.994281] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:20:08.769 [2024-07-23 17:14:03.994461] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa485e0 00:20:08.769 [2024-07-23 17:14:03.994597] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbdd870 00:20:08.769 [2024-07-23 17:14:03.994607] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbdd870 00:20:08.769 [2024-07-23 17:14:03.994696] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.769 17:14:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.769 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:09.028 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.028 "name": "raid_bdev1", 00:20:09.028 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:09.028 "strip_size_kb": 64, 00:20:09.028 "state": "online", 00:20:09.028 "raid_level": "concat", 00:20:09.028 "superblock": true, 00:20:09.028 "num_base_bdevs": 3, 00:20:09.028 "num_base_bdevs_discovered": 3, 00:20:09.028 "num_base_bdevs_operational": 3, 00:20:09.028 "base_bdevs_list": [ 00:20:09.028 { 00:20:09.028 "name": "pt1", 00:20:09.028 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:09.028 "is_configured": true, 00:20:09.028 "data_offset": 2048, 00:20:09.028 "data_size": 63488 00:20:09.028 }, 00:20:09.028 { 00:20:09.028 "name": "pt2", 00:20:09.028 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:09.028 "is_configured": true, 00:20:09.028 "data_offset": 2048, 00:20:09.028 "data_size": 63488 00:20:09.028 }, 00:20:09.028 { 00:20:09.028 "name": "pt3", 00:20:09.028 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:09.028 "is_configured": true, 00:20:09.028 "data_offset": 2048, 00:20:09.028 "data_size": 63488 00:20:09.028 } 00:20:09.028 ] 00:20:09.028 }' 00:20:09.028 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.028 17:14:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:09.595 [2024-07-23 17:14:04.955683] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:09.595 17:14:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:09.595 "name": "raid_bdev1", 00:20:09.595 "aliases": [ 00:20:09.595 "29db5de6-5533-43e2-8d2e-6c7b9241cda4" 00:20:09.595 ], 00:20:09.595 "product_name": "Raid Volume", 00:20:09.595 "block_size": 512, 00:20:09.595 "num_blocks": 190464, 00:20:09.595 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:09.595 "assigned_rate_limits": { 00:20:09.595 "rw_ios_per_sec": 0, 00:20:09.595 "rw_mbytes_per_sec": 0, 00:20:09.595 "r_mbytes_per_sec": 0, 00:20:09.595 "w_mbytes_per_sec": 0 00:20:09.595 }, 00:20:09.595 "claimed": false, 00:20:09.595 "zoned": false, 00:20:09.595 "supported_io_types": { 00:20:09.595 "read": true, 00:20:09.595 "write": true, 00:20:09.595 "unmap": true, 00:20:09.595 "flush": true, 00:20:09.595 "reset": true, 00:20:09.595 "nvme_admin": false, 00:20:09.595 "nvme_io": false, 00:20:09.595 "nvme_io_md": false, 00:20:09.595 "write_zeroes": true, 00:20:09.595 "zcopy": false, 00:20:09.595 "get_zone_info": false, 00:20:09.595 "zone_management": false, 00:20:09.595 "zone_append": false, 00:20:09.595 "compare": false, 00:20:09.595 "compare_and_write": false, 00:20:09.595 "abort": false, 00:20:09.595 "seek_hole": false, 00:20:09.595 
"seek_data": false, 00:20:09.595 "copy": false, 00:20:09.595 "nvme_iov_md": false 00:20:09.595 }, 00:20:09.595 "memory_domains": [ 00:20:09.595 { 00:20:09.595 "dma_device_id": "system", 00:20:09.595 "dma_device_type": 1 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.595 "dma_device_type": 2 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "dma_device_id": "system", 00:20:09.595 "dma_device_type": 1 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.595 "dma_device_type": 2 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "dma_device_id": "system", 00:20:09.595 "dma_device_type": 1 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.595 "dma_device_type": 2 00:20:09.595 } 00:20:09.595 ], 00:20:09.595 "driver_specific": { 00:20:09.595 "raid": { 00:20:09.595 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:09.595 "strip_size_kb": 64, 00:20:09.595 "state": "online", 00:20:09.595 "raid_level": "concat", 00:20:09.595 "superblock": true, 00:20:09.595 "num_base_bdevs": 3, 00:20:09.595 "num_base_bdevs_discovered": 3, 00:20:09.595 "num_base_bdevs_operational": 3, 00:20:09.595 "base_bdevs_list": [ 00:20:09.595 { 00:20:09.595 "name": "pt1", 00:20:09.595 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:09.595 "is_configured": true, 00:20:09.595 "data_offset": 2048, 00:20:09.595 "data_size": 63488 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "name": "pt2", 00:20:09.595 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:09.595 "is_configured": true, 00:20:09.595 "data_offset": 2048, 00:20:09.595 "data_size": 63488 00:20:09.595 }, 00:20:09.595 { 00:20:09.595 "name": "pt3", 00:20:09.595 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:09.595 "is_configured": true, 00:20:09.595 "data_offset": 2048, 00:20:09.595 "data_size": 63488 00:20:09.595 } 00:20:09.595 ] 00:20:09.595 } 00:20:09.595 } 00:20:09.595 }' 00:20:09.595 17:14:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:09.854 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:09.854 pt2 00:20:09.854 pt3' 00:20:09.854 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:09.855 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:09.855 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:09.855 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:09.855 "name": "pt1", 00:20:09.855 "aliases": [ 00:20:09.855 "00000000-0000-0000-0000-000000000001" 00:20:09.855 ], 00:20:09.855 "product_name": "passthru", 00:20:09.855 "block_size": 512, 00:20:09.855 "num_blocks": 65536, 00:20:09.855 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:09.855 "assigned_rate_limits": { 00:20:09.855 "rw_ios_per_sec": 0, 00:20:09.855 "rw_mbytes_per_sec": 0, 00:20:09.855 "r_mbytes_per_sec": 0, 00:20:09.855 "w_mbytes_per_sec": 0 00:20:09.855 }, 00:20:09.855 "claimed": true, 00:20:09.855 "claim_type": "exclusive_write", 00:20:09.855 "zoned": false, 00:20:09.855 "supported_io_types": { 00:20:09.855 "read": true, 00:20:09.855 "write": true, 00:20:09.855 "unmap": true, 00:20:09.855 "flush": true, 00:20:09.855 "reset": true, 00:20:09.855 "nvme_admin": false, 00:20:09.855 "nvme_io": false, 00:20:09.855 "nvme_io_md": false, 00:20:09.855 "write_zeroes": true, 00:20:09.855 "zcopy": true, 00:20:09.855 "get_zone_info": false, 00:20:09.855 "zone_management": false, 00:20:09.855 "zone_append": false, 00:20:09.855 "compare": false, 00:20:09.855 "compare_and_write": false, 00:20:09.855 "abort": true, 00:20:09.855 "seek_hole": false, 00:20:09.855 "seek_data": false, 
00:20:09.855 "copy": true, 00:20:09.855 "nvme_iov_md": false 00:20:09.855 }, 00:20:09.855 "memory_domains": [ 00:20:09.855 { 00:20:09.855 "dma_device_id": "system", 00:20:09.855 "dma_device_type": 1 00:20:09.855 }, 00:20:09.855 { 00:20:09.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.855 "dma_device_type": 2 00:20:09.855 } 00:20:09.855 ], 00:20:09.855 "driver_specific": { 00:20:09.855 "passthru": { 00:20:09.855 "name": "pt1", 00:20:09.855 "base_bdev_name": "malloc1" 00:20:09.855 } 00:20:09.855 } 00:20:09.855 }' 00:20:09.855 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.855 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.113 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.372 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:10.372 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:10.372 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:20:10.372 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:10.372 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.372 "name": "pt2", 00:20:10.372 "aliases": [ 00:20:10.372 "00000000-0000-0000-0000-000000000002" 00:20:10.372 ], 00:20:10.372 "product_name": "passthru", 00:20:10.372 "block_size": 512, 00:20:10.372 "num_blocks": 65536, 00:20:10.372 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:10.372 "assigned_rate_limits": { 00:20:10.372 "rw_ios_per_sec": 0, 00:20:10.372 "rw_mbytes_per_sec": 0, 00:20:10.372 "r_mbytes_per_sec": 0, 00:20:10.372 "w_mbytes_per_sec": 0 00:20:10.372 }, 00:20:10.372 "claimed": true, 00:20:10.372 "claim_type": "exclusive_write", 00:20:10.372 "zoned": false, 00:20:10.372 "supported_io_types": { 00:20:10.372 "read": true, 00:20:10.372 "write": true, 00:20:10.372 "unmap": true, 00:20:10.372 "flush": true, 00:20:10.372 "reset": true, 00:20:10.372 "nvme_admin": false, 00:20:10.372 "nvme_io": false, 00:20:10.372 "nvme_io_md": false, 00:20:10.372 "write_zeroes": true, 00:20:10.372 "zcopy": true, 00:20:10.372 "get_zone_info": false, 00:20:10.372 "zone_management": false, 00:20:10.372 "zone_append": false, 00:20:10.372 "compare": false, 00:20:10.372 "compare_and_write": false, 00:20:10.373 "abort": true, 00:20:10.373 "seek_hole": false, 00:20:10.373 "seek_data": false, 00:20:10.373 "copy": true, 00:20:10.373 "nvme_iov_md": false 00:20:10.373 }, 00:20:10.373 "memory_domains": [ 00:20:10.373 { 00:20:10.373 "dma_device_id": "system", 00:20:10.373 "dma_device_type": 1 00:20:10.373 }, 00:20:10.373 { 00:20:10.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.373 "dma_device_type": 2 00:20:10.373 } 00:20:10.373 ], 00:20:10.373 "driver_specific": { 00:20:10.373 "passthru": { 00:20:10.373 "name": "pt2", 00:20:10.373 "base_bdev_name": "malloc2" 00:20:10.373 } 00:20:10.373 } 00:20:10.373 }' 00:20:10.373 17:14:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.631 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.631 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.631 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.631 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.631 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.631 17:14:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.631 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:10.890 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:11.148 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:11.148 "name": "pt3", 00:20:11.148 "aliases": [ 00:20:11.148 "00000000-0000-0000-0000-000000000003" 00:20:11.148 ], 00:20:11.148 "product_name": "passthru", 00:20:11.148 "block_size": 512, 00:20:11.148 "num_blocks": 65536, 00:20:11.148 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:11.148 "assigned_rate_limits": { 
00:20:11.148 "rw_ios_per_sec": 0, 00:20:11.148 "rw_mbytes_per_sec": 0, 00:20:11.148 "r_mbytes_per_sec": 0, 00:20:11.148 "w_mbytes_per_sec": 0 00:20:11.148 }, 00:20:11.148 "claimed": true, 00:20:11.148 "claim_type": "exclusive_write", 00:20:11.148 "zoned": false, 00:20:11.148 "supported_io_types": { 00:20:11.148 "read": true, 00:20:11.148 "write": true, 00:20:11.148 "unmap": true, 00:20:11.148 "flush": true, 00:20:11.148 "reset": true, 00:20:11.148 "nvme_admin": false, 00:20:11.148 "nvme_io": false, 00:20:11.148 "nvme_io_md": false, 00:20:11.148 "write_zeroes": true, 00:20:11.148 "zcopy": true, 00:20:11.148 "get_zone_info": false, 00:20:11.148 "zone_management": false, 00:20:11.148 "zone_append": false, 00:20:11.148 "compare": false, 00:20:11.148 "compare_and_write": false, 00:20:11.148 "abort": true, 00:20:11.148 "seek_hole": false, 00:20:11.148 "seek_data": false, 00:20:11.148 "copy": true, 00:20:11.148 "nvme_iov_md": false 00:20:11.148 }, 00:20:11.148 "memory_domains": [ 00:20:11.148 { 00:20:11.148 "dma_device_id": "system", 00:20:11.148 "dma_device_type": 1 00:20:11.148 }, 00:20:11.148 { 00:20:11.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.148 "dma_device_type": 2 00:20:11.148 } 00:20:11.148 ], 00:20:11.148 "driver_specific": { 00:20:11.148 "passthru": { 00:20:11.148 "name": "pt3", 00:20:11.148 "base_bdev_name": "malloc3" 00:20:11.148 } 00:20:11.148 } 00:20:11.148 }' 00:20:11.148 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.148 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.148 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:11.148 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.149 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.149 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:20:11.149 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:11.407 17:14:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:11.665 [2024-07-23 17:14:07.001095] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:11.665 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=29db5de6-5533-43e2-8d2e-6c7b9241cda4 00:20:11.665 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 29db5de6-5533-43e2-8d2e-6c7b9241cda4 ']' 00:20:11.665 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:11.924 [2024-07-23 17:14:07.301611] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:11.924 [2024-07-23 17:14:07.301632] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:11.924 [2024-07-23 17:14:07.301682] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:11.924 [2024-07-23 17:14:07.301732] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:20:11.924 [2024-07-23 17:14:07.301744] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbdd870 name raid_bdev1, state offline 00:20:11.924 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.924 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:12.183 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:12.183 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:12.183 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:12.183 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:12.442 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:12.442 17:14:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:12.701 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:12.701 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:12.960 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:12.960 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:13.219 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:20:13.479 [2024-07-23 17:14:08.793490] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:13.479 [2024-07-23 17:14:08.794812] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:13.479 [2024-07-23 17:14:08.794855] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:13.479 [2024-07-23 17:14:08.794906] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:13.479 [2024-07-23 17:14:08.794947] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:13.479 [2024-07-23 17:14:08.794976] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:13.479 [2024-07-23 17:14:08.794994] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:13.479 [2024-07-23 17:14:08.795004] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac9090 name raid_bdev1, state configuring 00:20:13.479 request: 00:20:13.479 { 00:20:13.479 "name": "raid_bdev1", 00:20:13.479 "raid_level": "concat", 00:20:13.479 "base_bdevs": [ 00:20:13.479 "malloc1", 00:20:13.479 "malloc2", 00:20:13.479 "malloc3" 00:20:13.479 ], 00:20:13.479 "strip_size_kb": 64, 00:20:13.479 "superblock": false, 00:20:13.479 "method": "bdev_raid_create", 00:20:13.479 "req_id": 1 00:20:13.479 } 00:20:13.479 Got JSON-RPC error response 00:20:13.479 response: 00:20:13.479 { 00:20:13.479 "code": -17, 00:20:13.479 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:13.479 } 00:20:13.479 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:13.479 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:20:13.479 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:13.479 17:14:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:13.479 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.479 17:14:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:13.738 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:13.738 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:13.738 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:13.998 [2024-07-23 17:14:09.286738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:13.998 [2024-07-23 17:14:09.286784] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:13.998 [2024-07-23 17:14:09.286804] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbe33b0 00:20:13.998 [2024-07-23 17:14:09.286817] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:13.998 [2024-07-23 17:14:09.288485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:13.998 [2024-07-23 17:14:09.288516] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:13.998 [2024-07-23 17:14:09.288585] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:13.998 [2024-07-23 17:14:09.288615] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:13.998 pt1 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.998 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:14.257 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.257 "name": "raid_bdev1", 00:20:14.257 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:14.257 "strip_size_kb": 64, 00:20:14.257 "state": "configuring", 00:20:14.257 "raid_level": "concat", 00:20:14.257 "superblock": true, 00:20:14.257 "num_base_bdevs": 3, 00:20:14.257 "num_base_bdevs_discovered": 1, 00:20:14.257 "num_base_bdevs_operational": 3, 00:20:14.257 "base_bdevs_list": [ 00:20:14.257 { 00:20:14.257 "name": "pt1", 00:20:14.257 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:14.257 
"is_configured": true, 00:20:14.257 "data_offset": 2048, 00:20:14.257 "data_size": 63488 00:20:14.257 }, 00:20:14.257 { 00:20:14.257 "name": null, 00:20:14.257 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:14.257 "is_configured": false, 00:20:14.257 "data_offset": 2048, 00:20:14.257 "data_size": 63488 00:20:14.257 }, 00:20:14.257 { 00:20:14.257 "name": null, 00:20:14.257 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:14.257 "is_configured": false, 00:20:14.257 "data_offset": 2048, 00:20:14.257 "data_size": 63488 00:20:14.257 } 00:20:14.257 ] 00:20:14.257 }' 00:20:14.257 17:14:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.257 17:14:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.825 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:20:14.825 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:15.084 [2024-07-23 17:14:10.401718] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:15.084 [2024-07-23 17:14:10.401775] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.084 [2024-07-23 17:14:10.401794] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbde300 00:20:15.084 [2024-07-23 17:14:10.401807] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.084 [2024-07-23 17:14:10.402164] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.084 [2024-07-23 17:14:10.402180] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:15.084 [2024-07-23 17:14:10.402247] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:15.084 [2024-07-23 
17:14:10.402268] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:15.084 pt2 00:20:15.084 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:15.342 [2024-07-23 17:14:10.642357] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.342 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.604 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.604 "name": "raid_bdev1", 00:20:15.604 
"uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:15.604 "strip_size_kb": 64, 00:20:15.604 "state": "configuring", 00:20:15.604 "raid_level": "concat", 00:20:15.604 "superblock": true, 00:20:15.604 "num_base_bdevs": 3, 00:20:15.604 "num_base_bdevs_discovered": 1, 00:20:15.604 "num_base_bdevs_operational": 3, 00:20:15.604 "base_bdevs_list": [ 00:20:15.604 { 00:20:15.604 "name": "pt1", 00:20:15.604 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:15.604 "is_configured": true, 00:20:15.604 "data_offset": 2048, 00:20:15.604 "data_size": 63488 00:20:15.604 }, 00:20:15.604 { 00:20:15.604 "name": null, 00:20:15.604 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:15.604 "is_configured": false, 00:20:15.604 "data_offset": 2048, 00:20:15.604 "data_size": 63488 00:20:15.604 }, 00:20:15.604 { 00:20:15.604 "name": null, 00:20:15.604 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:15.604 "is_configured": false, 00:20:15.604 "data_offset": 2048, 00:20:15.604 "data_size": 63488 00:20:15.604 } 00:20:15.604 ] 00:20:15.604 }' 00:20:15.604 17:14:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.604 17:14:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:16.225 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:16.225 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:16.225 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:16.483 [2024-07-23 17:14:11.713215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:16.483 [2024-07-23 17:14:11.713263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.483 [2024-07-23 17:14:11.713281] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbde530 00:20:16.483 [2024-07-23 17:14:11.713294] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.483 [2024-07-23 17:14:11.713628] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.483 [2024-07-23 17:14:11.713644] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:16.483 [2024-07-23 17:14:11.713704] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:16.483 [2024-07-23 17:14:11.713722] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:16.483 pt2 00:20:16.483 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:16.483 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:16.483 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:16.742 [2024-07-23 17:14:11.965879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:16.742 [2024-07-23 17:14:11.965914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.742 [2024-07-23 17:14:11.965930] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa32ad0 00:20:16.742 [2024-07-23 17:14:11.965942] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.742 [2024-07-23 17:14:11.966233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.742 [2024-07-23 17:14:11.966250] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:16.742 [2024-07-23 17:14:11.966300] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:16.742 
[2024-07-23 17:14:11.966317] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:16.742 [2024-07-23 17:14:11.966416] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe2560 00:20:16.742 [2024-07-23 17:14:11.966427] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:20:16.742 [2024-07-23 17:14:11.966587] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa485e0 00:20:16.742 [2024-07-23 17:14:11.966708] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe2560 00:20:16.742 [2024-07-23 17:14:11.966718] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbe2560 00:20:16.742 [2024-07-23 17:14:11.966809] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:16.742 pt3 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.742 17:14:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.742 17:14:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.001 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.001 "name": "raid_bdev1", 00:20:17.001 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:17.001 "strip_size_kb": 64, 00:20:17.001 "state": "online", 00:20:17.001 "raid_level": "concat", 00:20:17.001 "superblock": true, 00:20:17.001 "num_base_bdevs": 3, 00:20:17.001 "num_base_bdevs_discovered": 3, 00:20:17.001 "num_base_bdevs_operational": 3, 00:20:17.001 "base_bdevs_list": [ 00:20:17.001 { 00:20:17.001 "name": "pt1", 00:20:17.001 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:17.001 "is_configured": true, 00:20:17.001 "data_offset": 2048, 00:20:17.001 "data_size": 63488 00:20:17.001 }, 00:20:17.001 { 00:20:17.001 "name": "pt2", 00:20:17.001 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:17.001 "is_configured": true, 00:20:17.001 "data_offset": 2048, 00:20:17.001 "data_size": 63488 00:20:17.001 }, 00:20:17.001 { 00:20:17.001 "name": "pt3", 00:20:17.001 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:17.001 "is_configured": true, 00:20:17.001 "data_offset": 2048, 00:20:17.001 "data_size": 63488 00:20:17.001 } 00:20:17.001 ] 00:20:17.001 }' 00:20:17.001 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.001 17:14:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:17.567 17:14:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:17.825 [2024-07-23 17:14:13.004918] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:17.825 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:17.825 "name": "raid_bdev1", 00:20:17.825 "aliases": [ 00:20:17.825 "29db5de6-5533-43e2-8d2e-6c7b9241cda4" 00:20:17.825 ], 00:20:17.825 "product_name": "Raid Volume", 00:20:17.825 "block_size": 512, 00:20:17.825 "num_blocks": 190464, 00:20:17.825 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:17.825 "assigned_rate_limits": { 00:20:17.825 "rw_ios_per_sec": 0, 00:20:17.825 "rw_mbytes_per_sec": 0, 00:20:17.825 "r_mbytes_per_sec": 0, 00:20:17.825 "w_mbytes_per_sec": 0 00:20:17.825 }, 00:20:17.825 "claimed": false, 00:20:17.825 "zoned": false, 00:20:17.825 "supported_io_types": { 00:20:17.825 "read": true, 00:20:17.825 "write": true, 00:20:17.825 "unmap": true, 00:20:17.825 "flush": true, 00:20:17.825 "reset": true, 00:20:17.825 "nvme_admin": false, 00:20:17.825 "nvme_io": false, 00:20:17.825 "nvme_io_md": false, 00:20:17.825 "write_zeroes": true, 00:20:17.825 "zcopy": false, 00:20:17.825 
"get_zone_info": false, 00:20:17.825 "zone_management": false, 00:20:17.825 "zone_append": false, 00:20:17.825 "compare": false, 00:20:17.825 "compare_and_write": false, 00:20:17.825 "abort": false, 00:20:17.825 "seek_hole": false, 00:20:17.825 "seek_data": false, 00:20:17.825 "copy": false, 00:20:17.825 "nvme_iov_md": false 00:20:17.825 }, 00:20:17.825 "memory_domains": [ 00:20:17.825 { 00:20:17.825 "dma_device_id": "system", 00:20:17.825 "dma_device_type": 1 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.825 "dma_device_type": 2 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "dma_device_id": "system", 00:20:17.825 "dma_device_type": 1 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.825 "dma_device_type": 2 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "dma_device_id": "system", 00:20:17.825 "dma_device_type": 1 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.825 "dma_device_type": 2 00:20:17.825 } 00:20:17.825 ], 00:20:17.825 "driver_specific": { 00:20:17.825 "raid": { 00:20:17.825 "uuid": "29db5de6-5533-43e2-8d2e-6c7b9241cda4", 00:20:17.825 "strip_size_kb": 64, 00:20:17.825 "state": "online", 00:20:17.825 "raid_level": "concat", 00:20:17.825 "superblock": true, 00:20:17.825 "num_base_bdevs": 3, 00:20:17.825 "num_base_bdevs_discovered": 3, 00:20:17.825 "num_base_bdevs_operational": 3, 00:20:17.825 "base_bdevs_list": [ 00:20:17.825 { 00:20:17.825 "name": "pt1", 00:20:17.825 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:17.825 "is_configured": true, 00:20:17.825 "data_offset": 2048, 00:20:17.825 "data_size": 63488 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "name": "pt2", 00:20:17.825 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:17.825 "is_configured": true, 00:20:17.825 "data_offset": 2048, 00:20:17.825 "data_size": 63488 00:20:17.825 }, 00:20:17.825 { 00:20:17.825 "name": "pt3", 00:20:17.825 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:20:17.825 "is_configured": true, 00:20:17.825 "data_offset": 2048, 00:20:17.825 "data_size": 63488 00:20:17.825 } 00:20:17.825 ] 00:20:17.825 } 00:20:17.825 } 00:20:17.825 }' 00:20:17.825 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:17.825 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:17.825 pt2 00:20:17.825 pt3' 00:20:17.825 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.825 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:17.825 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.826 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.826 "name": "pt1", 00:20:17.826 "aliases": [ 00:20:17.826 "00000000-0000-0000-0000-000000000001" 00:20:17.826 ], 00:20:17.826 "product_name": "passthru", 00:20:17.826 "block_size": 512, 00:20:17.826 "num_blocks": 65536, 00:20:17.826 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:17.826 "assigned_rate_limits": { 00:20:17.826 "rw_ios_per_sec": 0, 00:20:17.826 "rw_mbytes_per_sec": 0, 00:20:17.826 "r_mbytes_per_sec": 0, 00:20:17.826 "w_mbytes_per_sec": 0 00:20:17.826 }, 00:20:17.826 "claimed": true, 00:20:17.826 "claim_type": "exclusive_write", 00:20:17.826 "zoned": false, 00:20:17.826 "supported_io_types": { 00:20:17.826 "read": true, 00:20:17.826 "write": true, 00:20:17.826 "unmap": true, 00:20:17.826 "flush": true, 00:20:17.826 "reset": true, 00:20:17.826 "nvme_admin": false, 00:20:17.826 "nvme_io": false, 00:20:17.826 "nvme_io_md": false, 00:20:17.826 "write_zeroes": true, 00:20:17.826 "zcopy": true, 00:20:17.826 "get_zone_info": false, 
00:20:17.826 "zone_management": false, 00:20:17.826 "zone_append": false, 00:20:17.826 "compare": false, 00:20:17.826 "compare_and_write": false, 00:20:17.826 "abort": true, 00:20:17.826 "seek_hole": false, 00:20:17.826 "seek_data": false, 00:20:17.826 "copy": true, 00:20:17.826 "nvme_iov_md": false 00:20:17.826 }, 00:20:17.826 "memory_domains": [ 00:20:17.826 { 00:20:17.826 "dma_device_id": "system", 00:20:17.826 "dma_device_type": 1 00:20:17.826 }, 00:20:17.826 { 00:20:17.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.826 "dma_device_type": 2 00:20:17.826 } 00:20:17.826 ], 00:20:17.826 "driver_specific": { 00:20:17.826 "passthru": { 00:20:17.826 "name": "pt1", 00:20:17.826 "base_bdev_name": "malloc1" 00:20:17.826 } 00:20:17.826 } 00:20:17.826 }' 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.084 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.341 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.341 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.341 17:14:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:18.341 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:18.341 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:18.599 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:18.599 "name": "pt2", 00:20:18.599 "aliases": [ 00:20:18.599 "00000000-0000-0000-0000-000000000002" 00:20:18.599 ], 00:20:18.599 "product_name": "passthru", 00:20:18.599 "block_size": 512, 00:20:18.599 "num_blocks": 65536, 00:20:18.599 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:18.599 "assigned_rate_limits": { 00:20:18.599 "rw_ios_per_sec": 0, 00:20:18.599 "rw_mbytes_per_sec": 0, 00:20:18.599 "r_mbytes_per_sec": 0, 00:20:18.599 "w_mbytes_per_sec": 0 00:20:18.599 }, 00:20:18.599 "claimed": true, 00:20:18.599 "claim_type": "exclusive_write", 00:20:18.599 "zoned": false, 00:20:18.599 "supported_io_types": { 00:20:18.599 "read": true, 00:20:18.599 "write": true, 00:20:18.599 "unmap": true, 00:20:18.599 "flush": true, 00:20:18.599 "reset": true, 00:20:18.599 "nvme_admin": false, 00:20:18.599 "nvme_io": false, 00:20:18.599 "nvme_io_md": false, 00:20:18.599 "write_zeroes": true, 00:20:18.599 "zcopy": true, 00:20:18.599 "get_zone_info": false, 00:20:18.599 "zone_management": false, 00:20:18.599 "zone_append": false, 00:20:18.599 "compare": false, 00:20:18.599 "compare_and_write": false, 00:20:18.599 "abort": true, 00:20:18.599 "seek_hole": false, 00:20:18.599 "seek_data": false, 00:20:18.599 "copy": true, 00:20:18.599 "nvme_iov_md": false 00:20:18.599 }, 00:20:18.599 "memory_domains": [ 00:20:18.599 { 00:20:18.599 "dma_device_id": "system", 00:20:18.599 "dma_device_type": 1 00:20:18.599 }, 00:20:18.599 { 00:20:18.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.599 
"dma_device_type": 2 00:20:18.599 } 00:20:18.599 ], 00:20:18.599 "driver_specific": { 00:20:18.599 "passthru": { 00:20:18.599 "name": "pt2", 00:20:18.599 "base_bdev_name": "malloc2" 00:20:18.599 } 00:20:18.599 } 00:20:18.599 }' 00:20:18.599 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.599 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.599 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:18.599 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.599 17:14:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:18.857 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:19.115 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:19.115 "name": "pt3", 00:20:19.115 "aliases": [ 00:20:19.115 
"00000000-0000-0000-0000-000000000003" 00:20:19.115 ], 00:20:19.115 "product_name": "passthru", 00:20:19.115 "block_size": 512, 00:20:19.115 "num_blocks": 65536, 00:20:19.115 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:19.115 "assigned_rate_limits": { 00:20:19.115 "rw_ios_per_sec": 0, 00:20:19.115 "rw_mbytes_per_sec": 0, 00:20:19.115 "r_mbytes_per_sec": 0, 00:20:19.115 "w_mbytes_per_sec": 0 00:20:19.115 }, 00:20:19.115 "claimed": true, 00:20:19.115 "claim_type": "exclusive_write", 00:20:19.115 "zoned": false, 00:20:19.115 "supported_io_types": { 00:20:19.115 "read": true, 00:20:19.115 "write": true, 00:20:19.115 "unmap": true, 00:20:19.115 "flush": true, 00:20:19.115 "reset": true, 00:20:19.115 "nvme_admin": false, 00:20:19.115 "nvme_io": false, 00:20:19.115 "nvme_io_md": false, 00:20:19.115 "write_zeroes": true, 00:20:19.115 "zcopy": true, 00:20:19.115 "get_zone_info": false, 00:20:19.115 "zone_management": false, 00:20:19.115 "zone_append": false, 00:20:19.115 "compare": false, 00:20:19.115 "compare_and_write": false, 00:20:19.115 "abort": true, 00:20:19.115 "seek_hole": false, 00:20:19.115 "seek_data": false, 00:20:19.115 "copy": true, 00:20:19.115 "nvme_iov_md": false 00:20:19.115 }, 00:20:19.115 "memory_domains": [ 00:20:19.115 { 00:20:19.115 "dma_device_id": "system", 00:20:19.115 "dma_device_type": 1 00:20:19.116 }, 00:20:19.116 { 00:20:19.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.116 "dma_device_type": 2 00:20:19.116 } 00:20:19.116 ], 00:20:19.116 "driver_specific": { 00:20:19.116 "passthru": { 00:20:19.116 "name": "pt3", 00:20:19.116 "base_bdev_name": "malloc3" 00:20:19.116 } 00:20:19.116 } 00:20:19.116 }' 00:20:19.116 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.116 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.116 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:19.116 17:14:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:19.116 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:19.374 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:19.374 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:19.374 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:19.374 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:19.374 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:19.374 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:19.632 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:19.632 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:19.632 17:14:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:20.199 [2024-07-23 17:14:15.327052] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 29db5de6-5533-43e2-8d2e-6c7b9241cda4 '!=' 29db5de6-5533-43e2-8d2e-6c7b9241cda4 ']' 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4151982 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4151982 ']' 00:20:20.199 17:14:15 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4151982 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4151982 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4151982' 00:20:20.199 killing process with pid 4151982 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4151982 00:20:20.199 [2024-07-23 17:14:15.410134] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:20.199 [2024-07-23 17:14:15.410193] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:20.199 [2024-07-23 17:14:15.410247] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:20.199 [2024-07-23 17:14:15.410259] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe2560 name raid_bdev1, state offline 00:20:20.199 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4151982 00:20:20.199 [2024-07-23 17:14:15.441573] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:20.458 17:14:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:20.458 00:20:20.458 real 0m14.403s 00:20:20.458 user 0m25.916s 00:20:20.458 sys 0m2.669s 00:20:20.458 17:14:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:20.458 17:14:15 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.458 ************************************ 00:20:20.458 END TEST raid_superblock_test 00:20:20.458 ************************************ 00:20:20.458 17:14:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:20.458 17:14:15 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:20:20.458 17:14:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:20.458 17:14:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:20.458 17:14:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:20.458 ************************************ 00:20:20.458 START TEST raid_read_error_test 00:20:20.458 ************************************ 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:20.458 17:14:15 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.r0L3jfP6Md 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4154192 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4154192 /var/tmp/spdk-raid.sock 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4154192 ']' 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:20.458 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:20.458 17:14:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.458 [2024-07-23 17:14:15.814401] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:20:20.458 [2024-07-23 17:14:15.814463] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4154192 ] 00:20:20.716 [2024-07-23 17:14:15.944616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:20.717 [2024-07-23 17:14:15.993813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:20.717 [2024-07-23 17:14:16.054968] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:20.717 [2024-07-23 17:14:16.055006] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:21.651 17:14:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:21.652 17:14:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:21.652 17:14:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:21.652 17:14:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:21.909 BaseBdev1_malloc 00:20:21.909 17:14:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:22.167 true 00:20:22.167 17:14:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:22.425 [2024-07-23 17:14:17.766228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:22.425 [2024-07-23 17:14:17.766273] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:20:22.425 [2024-07-23 17:14:17.766293] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14dc5c0 00:20:22.425 [2024-07-23 17:14:17.766305] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:22.425 [2024-07-23 17:14:17.768010] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:22.425 [2024-07-23 17:14:17.768038] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:22.425 BaseBdev1 00:20:22.425 17:14:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:22.425 17:14:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:22.683 BaseBdev2_malloc 00:20:22.683 17:14:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:22.941 true 00:20:22.941 17:14:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:23.199 [2024-07-23 17:14:18.518076] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:23.199 [2024-07-23 17:14:18.518119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.199 [2024-07-23 17:14:18.518141] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d6620 00:20:23.199 [2024-07-23 17:14:18.518159] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.199 [2024-07-23 17:14:18.519604] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.199 [2024-07-23 17:14:18.519629] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:23.199 BaseBdev2 00:20:23.199 17:14:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:23.199 17:14:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:23.458 BaseBdev3_malloc 00:20:23.458 17:14:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:23.716 true 00:20:23.716 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:23.974 [2024-07-23 17:14:19.278027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:23.974 [2024-07-23 17:14:19.278071] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.974 [2024-07-23 17:14:19.278093] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d6c00 00:20:23.974 [2024-07-23 17:14:19.278106] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.974 [2024-07-23 17:14:19.279677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.974 [2024-07-23 17:14:19.279705] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:23.974 BaseBdev3 00:20:23.974 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:20:24.232 [2024-07-23 17:14:19.522711] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:24.232 [2024-07-23 17:14:19.524195] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:24.232 [2024-07-23 17:14:19.524260] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:24.232 [2024-07-23 17:14:19.524456] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d8670 00:20:24.232 [2024-07-23 17:14:19.524467] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:20:24.232 [2024-07-23 17:14:19.524662] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1328810 00:20:24.232 [2024-07-23 17:14:19.524807] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d8670 00:20:24.232 [2024-07-23 17:14:19.524817] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d8670 00:20:24.232 [2024-07-23 17:14:19.524930] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.232 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.491 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.491 "name": "raid_bdev1", 00:20:24.491 "uuid": "8c820c8d-a750-46b6-82d4-eb5ac73bcb42", 00:20:24.491 "strip_size_kb": 64, 00:20:24.491 "state": "online", 00:20:24.491 "raid_level": "concat", 00:20:24.491 "superblock": true, 00:20:24.491 "num_base_bdevs": 3, 00:20:24.491 "num_base_bdevs_discovered": 3, 00:20:24.491 "num_base_bdevs_operational": 3, 00:20:24.491 "base_bdevs_list": [ 00:20:24.491 { 00:20:24.491 "name": "BaseBdev1", 00:20:24.491 "uuid": "8b999aec-b162-50f5-a74a-afa10b61ad3a", 00:20:24.491 "is_configured": true, 00:20:24.491 "data_offset": 2048, 00:20:24.491 "data_size": 63488 00:20:24.491 }, 00:20:24.491 { 00:20:24.491 "name": "BaseBdev2", 00:20:24.491 "uuid": "255c9351-ec3e-5c44-9f9a-aefb0e28072b", 00:20:24.491 "is_configured": true, 00:20:24.491 "data_offset": 2048, 00:20:24.491 "data_size": 63488 00:20:24.491 }, 00:20:24.491 { 00:20:24.491 "name": "BaseBdev3", 00:20:24.491 "uuid": "1c442f82-447f-5139-87b2-9fb360e83939", 00:20:24.491 "is_configured": true, 00:20:24.491 "data_offset": 2048, 00:20:24.491 "data_size": 63488 00:20:24.491 } 00:20:24.491 ] 00:20:24.491 }' 00:20:24.491 17:14:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.491 17:14:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.427 17:14:20 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:20:25.427 17:14:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:25.685 [2024-07-23 17:14:20.910663] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d9e20 00:20:26.621 17:14:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.879 "name": "raid_bdev1", 00:20:26.879 "uuid": "8c820c8d-a750-46b6-82d4-eb5ac73bcb42", 00:20:26.879 "strip_size_kb": 64, 00:20:26.879 "state": "online", 00:20:26.879 "raid_level": "concat", 00:20:26.879 "superblock": true, 00:20:26.879 "num_base_bdevs": 3, 00:20:26.879 "num_base_bdevs_discovered": 3, 00:20:26.879 "num_base_bdevs_operational": 3, 00:20:26.879 "base_bdevs_list": [ 00:20:26.879 { 00:20:26.879 "name": "BaseBdev1", 00:20:26.879 "uuid": "8b999aec-b162-50f5-a74a-afa10b61ad3a", 00:20:26.879 "is_configured": true, 00:20:26.879 "data_offset": 2048, 00:20:26.879 "data_size": 63488 00:20:26.879 }, 00:20:26.879 { 00:20:26.879 "name": "BaseBdev2", 00:20:26.879 "uuid": "255c9351-ec3e-5c44-9f9a-aefb0e28072b", 00:20:26.879 "is_configured": true, 00:20:26.879 "data_offset": 2048, 00:20:26.879 "data_size": 63488 00:20:26.879 }, 00:20:26.879 { 00:20:26.879 "name": "BaseBdev3", 00:20:26.879 "uuid": "1c442f82-447f-5139-87b2-9fb360e83939", 00:20:26.879 "is_configured": true, 00:20:26.879 "data_offset": 2048, 00:20:26.879 "data_size": 63488 00:20:26.879 } 00:20:26.879 ] 00:20:26.879 }' 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.879 17:14:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.445 17:14:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:27.703 [2024-07-23 
17:14:23.033884] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:27.703 [2024-07-23 17:14:23.033931] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:27.703 [2024-07-23 17:14:23.037099] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:27.703 [2024-07-23 17:14:23.037132] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:27.703 [2024-07-23 17:14:23.037163] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:27.703 [2024-07-23 17:14:23.037174] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d8670 name raid_bdev1, state offline 00:20:27.703 0 00:20:27.703 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4154192 00:20:27.703 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4154192 ']' 00:20:27.703 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4154192 00:20:27.703 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:27.703 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:27.703 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4154192 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4154192' 00:20:27.961 killing process with pid 4154192 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4154192 00:20:27.961 [2024-07-23 17:14:23.139418] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4154192 00:20:27.961 [2024-07-23 17:14:23.161490] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.r0L3jfP6Md 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:27.961 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:20:28.220 00:20:28.220 real 0m7.652s 00:20:28.220 user 0m12.338s 00:20:28.220 sys 0m1.339s 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:28.220 17:14:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.220 ************************************ 00:20:28.220 END TEST raid_read_error_test 00:20:28.220 ************************************ 00:20:28.220 17:14:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:28.220 17:14:23 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:20:28.220 17:14:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:28.220 17:14:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:28.220 17:14:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:28.220 
************************************ 00:20:28.220 START TEST raid_write_error_test 00:20:28.220 ************************************ 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ix7skbDTmJ 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4155334 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4155334 /var/tmp/spdk-raid.sock 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4155334 ']' 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:20:28.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:28.220 17:14:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.220 [2024-07-23 17:14:23.559534] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:20:28.220 [2024-07-23 17:14:23.559606] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4155334 ] 00:20:28.479 [2024-07-23 17:14:23.692542] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:28.479 [2024-07-23 17:14:23.742739] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:28.479 [2024-07-23 17:14:23.806491] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:28.479 [2024-07-23 17:14:23.806537] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:29.047 17:14:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:29.047 17:14:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:29.047 17:14:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:29.047 17:14:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:29.306 BaseBdev1_malloc 00:20:29.306 17:14:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:29.306 
true 00:20:29.565 17:14:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:29.565 [2024-07-23 17:14:24.966240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:29.565 [2024-07-23 17:14:24.966285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.565 [2024-07-23 17:14:24.966305] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed45c0 00:20:29.565 [2024-07-23 17:14:24.966317] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.565 [2024-07-23 17:14:24.968036] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.565 [2024-07-23 17:14:24.968066] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:29.565 BaseBdev1 00:20:29.824 17:14:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:29.824 17:14:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:29.824 BaseBdev2_malloc 00:20:29.824 17:14:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:30.084 true 00:20:30.084 17:14:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:30.372 [2024-07-23 17:14:25.508208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:30.372 [2024-07-23 17:14:25.508249] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.372 [2024-07-23 17:14:25.508269] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xece620 00:20:30.372 [2024-07-23 17:14:25.508281] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.372 [2024-07-23 17:14:25.509669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.372 [2024-07-23 17:14:25.509697] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:30.372 BaseBdev2 00:20:30.372 17:14:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:30.372 17:14:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:30.372 BaseBdev3_malloc 00:20:30.630 17:14:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:30.630 true 00:20:30.630 17:14:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:30.888 [2024-07-23 17:14:26.202620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:30.888 [2024-07-23 17:14:26.202663] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.888 [2024-07-23 17:14:26.202683] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecec00 00:20:30.888 [2024-07-23 17:14:26.202695] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.888 [2024-07-23 17:14:26.204132] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:20:30.888 [2024-07-23 17:14:26.204163] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:30.888 BaseBdev3 00:20:30.888 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:20:31.146 [2024-07-23 17:14:26.383137] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:31.146 [2024-07-23 17:14:26.384390] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:31.146 [2024-07-23 17:14:26.384452] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:31.146 [2024-07-23 17:14:26.384640] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xed0670 00:20:31.146 [2024-07-23 17:14:26.384652] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:20:31.146 [2024-07-23 17:14:26.384823] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd20810 00:20:31.146 [2024-07-23 17:14:26.384971] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed0670 00:20:31.146 [2024-07-23 17:14:26.384982] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed0670 00:20:31.146 [2024-07-23 17:14:26.385076] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.146 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.405 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.405 "name": "raid_bdev1", 00:20:31.405 "uuid": "fe36872e-5e11-4bca-8903-4fc678eeae2d", 00:20:31.405 "strip_size_kb": 64, 00:20:31.405 "state": "online", 00:20:31.405 "raid_level": "concat", 00:20:31.405 "superblock": true, 00:20:31.405 "num_base_bdevs": 3, 00:20:31.405 "num_base_bdevs_discovered": 3, 00:20:31.405 "num_base_bdevs_operational": 3, 00:20:31.405 "base_bdevs_list": [ 00:20:31.405 { 00:20:31.405 "name": "BaseBdev1", 00:20:31.405 "uuid": "16274733-018f-5a55-9983-8eb2f03d842a", 00:20:31.405 "is_configured": true, 00:20:31.405 "data_offset": 2048, 00:20:31.405 "data_size": 63488 00:20:31.405 }, 00:20:31.405 { 00:20:31.405 "name": "BaseBdev2", 00:20:31.405 "uuid": "8882f4a8-729c-59e5-bff9-d259cb08316a", 00:20:31.405 "is_configured": true, 00:20:31.405 "data_offset": 2048, 00:20:31.405 "data_size": 63488 00:20:31.405 }, 00:20:31.405 { 00:20:31.405 
"name": "BaseBdev3", 00:20:31.405 "uuid": "f1b20737-6b6f-5a7c-b760-bd28c3f58360", 00:20:31.405 "is_configured": true, 00:20:31.405 "data_offset": 2048, 00:20:31.405 "data_size": 63488 00:20:31.405 } 00:20:31.405 ] 00:20:31.405 }' 00:20:31.405 17:14:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.405 17:14:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.086 17:14:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:32.086 17:14:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:32.086 [2024-07-23 17:14:27.281794] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xed1e20 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.022 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.281 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.281 "name": "raid_bdev1", 00:20:33.281 "uuid": "fe36872e-5e11-4bca-8903-4fc678eeae2d", 00:20:33.281 "strip_size_kb": 64, 00:20:33.281 "state": "online", 00:20:33.281 "raid_level": "concat", 00:20:33.281 "superblock": true, 00:20:33.281 "num_base_bdevs": 3, 00:20:33.281 "num_base_bdevs_discovered": 3, 00:20:33.281 "num_base_bdevs_operational": 3, 00:20:33.281 "base_bdevs_list": [ 00:20:33.281 { 00:20:33.281 "name": "BaseBdev1", 00:20:33.281 "uuid": "16274733-018f-5a55-9983-8eb2f03d842a", 00:20:33.281 "is_configured": true, 00:20:33.281 "data_offset": 2048, 00:20:33.281 "data_size": 63488 00:20:33.281 }, 00:20:33.281 { 00:20:33.281 "name": "BaseBdev2", 00:20:33.281 "uuid": "8882f4a8-729c-59e5-bff9-d259cb08316a", 00:20:33.281 "is_configured": true, 00:20:33.281 "data_offset": 2048, 00:20:33.281 "data_size": 63488 00:20:33.281 }, 00:20:33.281 { 00:20:33.281 "name": "BaseBdev3", 00:20:33.281 "uuid": "f1b20737-6b6f-5a7c-b760-bd28c3f58360", 00:20:33.281 "is_configured": true, 00:20:33.281 "data_offset": 2048, 
00:20:33.281 "data_size": 63488 00:20:33.281 } 00:20:33.281 ] 00:20:33.281 }' 00:20:33.281 17:14:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.281 17:14:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.848 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:34.107 [2024-07-23 17:14:29.481962] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:34.107 [2024-07-23 17:14:29.482009] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:34.107 [2024-07-23 17:14:29.485190] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:34.107 [2024-07-23 17:14:29.485227] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:34.107 [2024-07-23 17:14:29.485259] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:34.107 [2024-07-23 17:14:29.485270] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed0670 name raid_bdev1, state offline 00:20:34.107 0 00:20:34.107 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4155334 00:20:34.107 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4155334 ']' 00:20:34.107 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4155334 00:20:34.107 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:34.107 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:34.107 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4155334 00:20:34.365 17:14:29 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:34.365 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:34.365 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4155334' 00:20:34.366 killing process with pid 4155334 00:20:34.366 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4155334 00:20:34.366 [2024-07-23 17:14:29.564384] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:34.366 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4155334 00:20:34.366 [2024-07-23 17:14:29.586245] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ix7skbDTmJ 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:20:34.625 00:20:34.625 real 0m6.336s 00:20:34.625 user 0m9.880s 00:20:34.625 sys 0m1.130s 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:34.625 17:14:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.625 ************************************ 00:20:34.625 END TEST raid_write_error_test 
00:20:34.625 ************************************ 00:20:34.625 17:14:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:34.625 17:14:29 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:34.625 17:14:29 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:20:34.625 17:14:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:34.625 17:14:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:34.625 17:14:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:34.625 ************************************ 00:20:34.625 START TEST raid_state_function_test 00:20:34.625 ************************************ 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:34.625 17:14:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4156196 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4156196' 00:20:34.625 Process raid pid: 4156196 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4156196 
/var/tmp/spdk-raid.sock 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4156196 ']' 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:34.625 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:34.625 17:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.625 [2024-07-23 17:14:29.968301] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:20:34.625 [2024-07-23 17:14:29.968368] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:34.884 [2024-07-23 17:14:30.110017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:34.884 [2024-07-23 17:14:30.162054] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:34.884 [2024-07-23 17:14:30.216891] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:34.884 [2024-07-23 17:14:30.216921] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:35.820 17:14:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:35.820 17:14:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:35.820 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:20:36.386 [2024-07-23 17:14:31.650994] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:36.386 [2024-07-23 17:14:31.651035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:36.386 [2024-07-23 17:14:31.651045] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:36.387 [2024-07-23 17:14:31.651057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:36.387 [2024-07-23 17:14:31.651066] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:36.387 [2024-07-23 17:14:31.651077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:36.387 17:14:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.387 17:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.953 17:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.953 "name": "Existed_Raid", 00:20:36.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.953 "strip_size_kb": 0, 00:20:36.953 "state": "configuring", 00:20:36.953 "raid_level": "raid1", 00:20:36.953 "superblock": false, 00:20:36.953 "num_base_bdevs": 3, 00:20:36.953 "num_base_bdevs_discovered": 0, 00:20:36.953 "num_base_bdevs_operational": 3, 00:20:36.953 "base_bdevs_list": [ 00:20:36.953 { 00:20:36.953 
"name": "BaseBdev1", 00:20:36.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.953 "is_configured": false, 00:20:36.953 "data_offset": 0, 00:20:36.953 "data_size": 0 00:20:36.953 }, 00:20:36.953 { 00:20:36.953 "name": "BaseBdev2", 00:20:36.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.953 "is_configured": false, 00:20:36.953 "data_offset": 0, 00:20:36.953 "data_size": 0 00:20:36.953 }, 00:20:36.953 { 00:20:36.953 "name": "BaseBdev3", 00:20:36.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.953 "is_configured": false, 00:20:36.953 "data_offset": 0, 00:20:36.953 "data_size": 0 00:20:36.953 } 00:20:36.953 ] 00:20:36.953 }' 00:20:36.953 17:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.953 17:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:37.520 17:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:38.086 [2024-07-23 17:14:33.287153] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:38.086 [2024-07-23 17:14:33.287186] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x87c280 name Existed_Raid, state configuring 00:20:38.086 17:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:20:38.652 [2024-07-23 17:14:33.800493] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:38.652 [2024-07-23 17:14:33.800525] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:38.652 [2024-07-23 17:14:33.800535] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:20:38.652 [2024-07-23 17:14:33.800546] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:38.652 [2024-07-23 17:14:33.800555] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:38.652 [2024-07-23 17:14:33.800565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:38.652 17:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:38.911 [2024-07-23 17:14:34.324883] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:38.911 BaseBdev1 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:39.171 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:39.739 17:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:39.998 [ 00:20:39.998 { 00:20:39.998 "name": "BaseBdev1", 00:20:39.998 "aliases": [ 00:20:39.998 "a2446750-a0c4-458f-a20e-953929bea2eb" 
00:20:39.998 ], 00:20:39.998 "product_name": "Malloc disk", 00:20:39.998 "block_size": 512, 00:20:39.998 "num_blocks": 65536, 00:20:39.998 "uuid": "a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:39.998 "assigned_rate_limits": { 00:20:39.998 "rw_ios_per_sec": 0, 00:20:39.998 "rw_mbytes_per_sec": 0, 00:20:39.998 "r_mbytes_per_sec": 0, 00:20:39.998 "w_mbytes_per_sec": 0 00:20:39.998 }, 00:20:39.998 "claimed": true, 00:20:39.998 "claim_type": "exclusive_write", 00:20:39.998 "zoned": false, 00:20:39.998 "supported_io_types": { 00:20:39.998 "read": true, 00:20:39.998 "write": true, 00:20:39.998 "unmap": true, 00:20:39.998 "flush": true, 00:20:39.998 "reset": true, 00:20:39.998 "nvme_admin": false, 00:20:39.998 "nvme_io": false, 00:20:39.998 "nvme_io_md": false, 00:20:39.998 "write_zeroes": true, 00:20:39.998 "zcopy": true, 00:20:39.998 "get_zone_info": false, 00:20:39.998 "zone_management": false, 00:20:39.998 "zone_append": false, 00:20:39.998 "compare": false, 00:20:39.998 "compare_and_write": false, 00:20:39.998 "abort": true, 00:20:39.998 "seek_hole": false, 00:20:39.998 "seek_data": false, 00:20:39.998 "copy": true, 00:20:39.998 "nvme_iov_md": false 00:20:39.998 }, 00:20:39.998 "memory_domains": [ 00:20:39.998 { 00:20:39.998 "dma_device_id": "system", 00:20:39.998 "dma_device_type": 1 00:20:39.998 }, 00:20:39.998 { 00:20:39.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.998 "dma_device_type": 2 00:20:39.998 } 00:20:39.998 ], 00:20:39.998 "driver_specific": {} 00:20:39.998 } 00:20:39.998 ] 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.998 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.565 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.565 "name": "Existed_Raid", 00:20:40.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.565 "strip_size_kb": 0, 00:20:40.565 "state": "configuring", 00:20:40.565 "raid_level": "raid1", 00:20:40.565 "superblock": false, 00:20:40.565 "num_base_bdevs": 3, 00:20:40.565 "num_base_bdevs_discovered": 1, 00:20:40.565 "num_base_bdevs_operational": 3, 00:20:40.565 "base_bdevs_list": [ 00:20:40.565 { 00:20:40.565 "name": "BaseBdev1", 00:20:40.565 "uuid": "a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:40.565 "is_configured": true, 00:20:40.565 "data_offset": 0, 00:20:40.565 "data_size": 65536 00:20:40.565 }, 00:20:40.565 { 00:20:40.565 "name": "BaseBdev2", 00:20:40.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.565 "is_configured": 
false, 00:20:40.565 "data_offset": 0, 00:20:40.565 "data_size": 0 00:20:40.565 }, 00:20:40.565 { 00:20:40.565 "name": "BaseBdev3", 00:20:40.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.565 "is_configured": false, 00:20:40.565 "data_offset": 0, 00:20:40.565 "data_size": 0 00:20:40.565 } 00:20:40.565 ] 00:20:40.565 }' 00:20:40.565 17:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.565 17:14:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:41.131 17:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:41.389 [2024-07-23 17:14:36.731269] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:41.389 [2024-07-23 17:14:36.731306] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x87bbb0 name Existed_Raid, state configuring 00:20:41.389 17:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:20:41.647 [2024-07-23 17:14:36.979966] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:41.647 [2024-07-23 17:14:36.981387] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:41.647 [2024-07-23 17:14:36.981418] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:41.647 [2024-07-23 17:14:36.981429] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:41.647 [2024-07-23 17:14:36.981440] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.647 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.905 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.905 "name": "Existed_Raid", 00:20:41.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.905 "strip_size_kb": 0, 00:20:41.905 "state": "configuring", 00:20:41.905 "raid_level": "raid1", 00:20:41.905 "superblock": false, 00:20:41.905 "num_base_bdevs": 3, 
00:20:41.905 "num_base_bdevs_discovered": 1, 00:20:41.905 "num_base_bdevs_operational": 3, 00:20:41.905 "base_bdevs_list": [ 00:20:41.905 { 00:20:41.905 "name": "BaseBdev1", 00:20:41.905 "uuid": "a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:41.905 "is_configured": true, 00:20:41.905 "data_offset": 0, 00:20:41.905 "data_size": 65536 00:20:41.905 }, 00:20:41.905 { 00:20:41.905 "name": "BaseBdev2", 00:20:41.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.905 "is_configured": false, 00:20:41.905 "data_offset": 0, 00:20:41.905 "data_size": 0 00:20:41.905 }, 00:20:41.905 { 00:20:41.905 "name": "BaseBdev3", 00:20:41.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.905 "is_configured": false, 00:20:41.905 "data_offset": 0, 00:20:41.905 "data_size": 0 00:20:41.905 } 00:20:41.905 ] 00:20:41.905 }' 00:20:41.905 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.905 17:14:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.471 17:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:42.729 [2024-07-23 17:14:38.082282] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:42.729 BaseBdev2 00:20:42.729 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:42.729 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:42.729 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:42.729 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:42.729 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:42.729 17:14:38 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:42.729 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:42.987 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:43.244 [ 00:20:43.244 { 00:20:43.244 "name": "BaseBdev2", 00:20:43.244 "aliases": [ 00:20:43.244 "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9" 00:20:43.244 ], 00:20:43.244 "product_name": "Malloc disk", 00:20:43.244 "block_size": 512, 00:20:43.244 "num_blocks": 65536, 00:20:43.244 "uuid": "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9", 00:20:43.244 "assigned_rate_limits": { 00:20:43.244 "rw_ios_per_sec": 0, 00:20:43.245 "rw_mbytes_per_sec": 0, 00:20:43.245 "r_mbytes_per_sec": 0, 00:20:43.245 "w_mbytes_per_sec": 0 00:20:43.245 }, 00:20:43.245 "claimed": true, 00:20:43.245 "claim_type": "exclusive_write", 00:20:43.245 "zoned": false, 00:20:43.245 "supported_io_types": { 00:20:43.245 "read": true, 00:20:43.245 "write": true, 00:20:43.245 "unmap": true, 00:20:43.245 "flush": true, 00:20:43.245 "reset": true, 00:20:43.245 "nvme_admin": false, 00:20:43.245 "nvme_io": false, 00:20:43.245 "nvme_io_md": false, 00:20:43.245 "write_zeroes": true, 00:20:43.245 "zcopy": true, 00:20:43.245 "get_zone_info": false, 00:20:43.245 "zone_management": false, 00:20:43.245 "zone_append": false, 00:20:43.245 "compare": false, 00:20:43.245 "compare_and_write": false, 00:20:43.245 "abort": true, 00:20:43.245 "seek_hole": false, 00:20:43.245 "seek_data": false, 00:20:43.245 "copy": true, 00:20:43.245 "nvme_iov_md": false 00:20:43.245 }, 00:20:43.245 "memory_domains": [ 00:20:43.245 { 00:20:43.245 "dma_device_id": "system", 00:20:43.245 "dma_device_type": 1 00:20:43.245 }, 00:20:43.245 { 
00:20:43.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.245 "dma_device_type": 2 00:20:43.245 } 00:20:43.245 ], 00:20:43.245 "driver_specific": {} 00:20:43.245 } 00:20:43.245 ] 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.245 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:20:43.503 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:43.503 "name": "Existed_Raid", 00:20:43.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.503 "strip_size_kb": 0, 00:20:43.503 "state": "configuring", 00:20:43.503 "raid_level": "raid1", 00:20:43.503 "superblock": false, 00:20:43.503 "num_base_bdevs": 3, 00:20:43.503 "num_base_bdevs_discovered": 2, 00:20:43.503 "num_base_bdevs_operational": 3, 00:20:43.503 "base_bdevs_list": [ 00:20:43.503 { 00:20:43.503 "name": "BaseBdev1", 00:20:43.503 "uuid": "a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:43.503 "is_configured": true, 00:20:43.503 "data_offset": 0, 00:20:43.503 "data_size": 65536 00:20:43.503 }, 00:20:43.503 { 00:20:43.503 "name": "BaseBdev2", 00:20:43.503 "uuid": "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9", 00:20:43.503 "is_configured": true, 00:20:43.503 "data_offset": 0, 00:20:43.503 "data_size": 65536 00:20:43.503 }, 00:20:43.503 { 00:20:43.503 "name": "BaseBdev3", 00:20:43.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.503 "is_configured": false, 00:20:43.503 "data_offset": 0, 00:20:43.503 "data_size": 0 00:20:43.503 } 00:20:43.503 ] 00:20:43.503 }' 00:20:43.503 17:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:43.503 17:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:44.069 17:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:44.327 [2024-07-23 17:14:39.705973] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:44.327 [2024-07-23 17:14:39.706011] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x87b800 00:20:44.327 [2024-07-23 17:14:39.706020] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:20:44.327 [2024-07-23 17:14:39.706268] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x87fb50 00:20:44.327 [2024-07-23 17:14:39.706388] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x87b800 00:20:44.327 [2024-07-23 17:14:39.706398] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x87b800 00:20:44.327 [2024-07-23 17:14:39.706551] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.327 BaseBdev3 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:44.327 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:44.585 17:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:44.843 [ 00:20:44.843 { 00:20:44.843 "name": "BaseBdev3", 00:20:44.843 "aliases": [ 00:20:44.843 "c93b2670-657b-4b09-9b5a-bf72270523cb" 00:20:44.843 ], 00:20:44.843 "product_name": "Malloc disk", 00:20:44.843 "block_size": 512, 00:20:44.843 "num_blocks": 65536, 00:20:44.843 "uuid": "c93b2670-657b-4b09-9b5a-bf72270523cb", 00:20:44.843 "assigned_rate_limits": { 
00:20:44.843 "rw_ios_per_sec": 0, 00:20:44.843 "rw_mbytes_per_sec": 0, 00:20:44.843 "r_mbytes_per_sec": 0, 00:20:44.843 "w_mbytes_per_sec": 0 00:20:44.843 }, 00:20:44.843 "claimed": true, 00:20:44.843 "claim_type": "exclusive_write", 00:20:44.843 "zoned": false, 00:20:44.843 "supported_io_types": { 00:20:44.843 "read": true, 00:20:44.843 "write": true, 00:20:44.843 "unmap": true, 00:20:44.843 "flush": true, 00:20:44.843 "reset": true, 00:20:44.843 "nvme_admin": false, 00:20:44.843 "nvme_io": false, 00:20:44.843 "nvme_io_md": false, 00:20:44.843 "write_zeroes": true, 00:20:44.843 "zcopy": true, 00:20:44.843 "get_zone_info": false, 00:20:44.843 "zone_management": false, 00:20:44.843 "zone_append": false, 00:20:44.843 "compare": false, 00:20:44.843 "compare_and_write": false, 00:20:44.843 "abort": true, 00:20:44.843 "seek_hole": false, 00:20:44.843 "seek_data": false, 00:20:44.843 "copy": true, 00:20:44.843 "nvme_iov_md": false 00:20:44.843 }, 00:20:44.843 "memory_domains": [ 00:20:44.843 { 00:20:44.843 "dma_device_id": "system", 00:20:44.843 "dma_device_type": 1 00:20:44.843 }, 00:20:44.843 { 00:20:44.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.843 "dma_device_type": 2 00:20:44.843 } 00:20:44.843 ], 00:20:44.843 "driver_specific": {} 00:20:44.843 } 00:20:44.843 ] 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:44.843 
17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.843 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.101 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.101 "name": "Existed_Raid", 00:20:45.101 "uuid": "84c25375-d923-44e6-98c3-2556201d0873", 00:20:45.101 "strip_size_kb": 0, 00:20:45.101 "state": "online", 00:20:45.101 "raid_level": "raid1", 00:20:45.101 "superblock": false, 00:20:45.101 "num_base_bdevs": 3, 00:20:45.101 "num_base_bdevs_discovered": 3, 00:20:45.101 "num_base_bdevs_operational": 3, 00:20:45.101 "base_bdevs_list": [ 00:20:45.101 { 00:20:45.101 "name": "BaseBdev1", 00:20:45.102 "uuid": "a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:45.102 "is_configured": true, 00:20:45.102 "data_offset": 0, 00:20:45.102 "data_size": 65536 00:20:45.102 }, 00:20:45.102 { 00:20:45.102 "name": "BaseBdev2", 00:20:45.102 "uuid": "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9", 00:20:45.102 "is_configured": true, 00:20:45.102 "data_offset": 0, 
00:20:45.102 "data_size": 65536 00:20:45.102 }, 00:20:45.102 { 00:20:45.102 "name": "BaseBdev3", 00:20:45.102 "uuid": "c93b2670-657b-4b09-9b5a-bf72270523cb", 00:20:45.102 "is_configured": true, 00:20:45.102 "data_offset": 0, 00:20:45.102 "data_size": 65536 00:20:45.102 } 00:20:45.102 ] 00:20:45.102 }' 00:20:45.102 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.102 17:14:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:45.667 17:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:45.925 [2024-07-23 17:14:41.186192] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:45.925 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:45.925 "name": "Existed_Raid", 00:20:45.925 "aliases": [ 00:20:45.925 "84c25375-d923-44e6-98c3-2556201d0873" 00:20:45.925 ], 00:20:45.925 "product_name": "Raid Volume", 00:20:45.925 "block_size": 512, 00:20:45.925 "num_blocks": 65536, 00:20:45.925 "uuid": 
"84c25375-d923-44e6-98c3-2556201d0873", 00:20:45.925 "assigned_rate_limits": { 00:20:45.925 "rw_ios_per_sec": 0, 00:20:45.925 "rw_mbytes_per_sec": 0, 00:20:45.925 "r_mbytes_per_sec": 0, 00:20:45.925 "w_mbytes_per_sec": 0 00:20:45.925 }, 00:20:45.925 "claimed": false, 00:20:45.925 "zoned": false, 00:20:45.925 "supported_io_types": { 00:20:45.925 "read": true, 00:20:45.925 "write": true, 00:20:45.925 "unmap": false, 00:20:45.925 "flush": false, 00:20:45.925 "reset": true, 00:20:45.925 "nvme_admin": false, 00:20:45.925 "nvme_io": false, 00:20:45.925 "nvme_io_md": false, 00:20:45.925 "write_zeroes": true, 00:20:45.925 "zcopy": false, 00:20:45.925 "get_zone_info": false, 00:20:45.925 "zone_management": false, 00:20:45.925 "zone_append": false, 00:20:45.925 "compare": false, 00:20:45.925 "compare_and_write": false, 00:20:45.925 "abort": false, 00:20:45.925 "seek_hole": false, 00:20:45.925 "seek_data": false, 00:20:45.925 "copy": false, 00:20:45.925 "nvme_iov_md": false 00:20:45.925 }, 00:20:45.925 "memory_domains": [ 00:20:45.925 { 00:20:45.925 "dma_device_id": "system", 00:20:45.925 "dma_device_type": 1 00:20:45.925 }, 00:20:45.925 { 00:20:45.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.925 "dma_device_type": 2 00:20:45.925 }, 00:20:45.925 { 00:20:45.925 "dma_device_id": "system", 00:20:45.925 "dma_device_type": 1 00:20:45.925 }, 00:20:45.925 { 00:20:45.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.925 "dma_device_type": 2 00:20:45.925 }, 00:20:45.925 { 00:20:45.925 "dma_device_id": "system", 00:20:45.925 "dma_device_type": 1 00:20:45.925 }, 00:20:45.925 { 00:20:45.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.925 "dma_device_type": 2 00:20:45.925 } 00:20:45.925 ], 00:20:45.925 "driver_specific": { 00:20:45.925 "raid": { 00:20:45.925 "uuid": "84c25375-d923-44e6-98c3-2556201d0873", 00:20:45.925 "strip_size_kb": 0, 00:20:45.926 "state": "online", 00:20:45.926 "raid_level": "raid1", 00:20:45.926 "superblock": false, 00:20:45.926 
"num_base_bdevs": 3, 00:20:45.926 "num_base_bdevs_discovered": 3, 00:20:45.926 "num_base_bdevs_operational": 3, 00:20:45.926 "base_bdevs_list": [ 00:20:45.926 { 00:20:45.926 "name": "BaseBdev1", 00:20:45.926 "uuid": "a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:45.926 "is_configured": true, 00:20:45.926 "data_offset": 0, 00:20:45.926 "data_size": 65536 00:20:45.926 }, 00:20:45.926 { 00:20:45.926 "name": "BaseBdev2", 00:20:45.926 "uuid": "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9", 00:20:45.926 "is_configured": true, 00:20:45.926 "data_offset": 0, 00:20:45.926 "data_size": 65536 00:20:45.926 }, 00:20:45.926 { 00:20:45.926 "name": "BaseBdev3", 00:20:45.926 "uuid": "c93b2670-657b-4b09-9b5a-bf72270523cb", 00:20:45.926 "is_configured": true, 00:20:45.926 "data_offset": 0, 00:20:45.926 "data_size": 65536 00:20:45.926 } 00:20:45.926 ] 00:20:45.926 } 00:20:45.926 } 00:20:45.926 }' 00:20:45.926 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:45.926 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:45.926 BaseBdev2 00:20:45.926 BaseBdev3' 00:20:45.926 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:45.926 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:45.926 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:46.183 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:46.183 "name": "BaseBdev1", 00:20:46.183 "aliases": [ 00:20:46.183 "a2446750-a0c4-458f-a20e-953929bea2eb" 00:20:46.183 ], 00:20:46.183 "product_name": "Malloc disk", 00:20:46.183 "block_size": 512, 00:20:46.183 "num_blocks": 65536, 00:20:46.183 "uuid": 
"a2446750-a0c4-458f-a20e-953929bea2eb", 00:20:46.183 "assigned_rate_limits": { 00:20:46.183 "rw_ios_per_sec": 0, 00:20:46.183 "rw_mbytes_per_sec": 0, 00:20:46.183 "r_mbytes_per_sec": 0, 00:20:46.183 "w_mbytes_per_sec": 0 00:20:46.183 }, 00:20:46.183 "claimed": true, 00:20:46.184 "claim_type": "exclusive_write", 00:20:46.184 "zoned": false, 00:20:46.184 "supported_io_types": { 00:20:46.184 "read": true, 00:20:46.184 "write": true, 00:20:46.184 "unmap": true, 00:20:46.184 "flush": true, 00:20:46.184 "reset": true, 00:20:46.184 "nvme_admin": false, 00:20:46.184 "nvme_io": false, 00:20:46.184 "nvme_io_md": false, 00:20:46.184 "write_zeroes": true, 00:20:46.184 "zcopy": true, 00:20:46.184 "get_zone_info": false, 00:20:46.184 "zone_management": false, 00:20:46.184 "zone_append": false, 00:20:46.184 "compare": false, 00:20:46.184 "compare_and_write": false, 00:20:46.184 "abort": true, 00:20:46.184 "seek_hole": false, 00:20:46.184 "seek_data": false, 00:20:46.184 "copy": true, 00:20:46.184 "nvme_iov_md": false 00:20:46.184 }, 00:20:46.184 "memory_domains": [ 00:20:46.184 { 00:20:46.184 "dma_device_id": "system", 00:20:46.184 "dma_device_type": 1 00:20:46.184 }, 00:20:46.184 { 00:20:46.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.184 "dma_device_type": 2 00:20:46.184 } 00:20:46.184 ], 00:20:46.184 "driver_specific": {} 00:20:46.184 }' 00:20:46.184 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.184 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.184 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:46.184 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:46.442 17:14:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:46.442 17:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:46.699 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:46.699 "name": "BaseBdev2", 00:20:46.699 "aliases": [ 00:20:46.699 "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9" 00:20:46.699 ], 00:20:46.699 "product_name": "Malloc disk", 00:20:46.699 "block_size": 512, 00:20:46.699 "num_blocks": 65536, 00:20:46.699 "uuid": "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9", 00:20:46.699 "assigned_rate_limits": { 00:20:46.699 "rw_ios_per_sec": 0, 00:20:46.699 "rw_mbytes_per_sec": 0, 00:20:46.699 "r_mbytes_per_sec": 0, 00:20:46.699 "w_mbytes_per_sec": 0 00:20:46.699 }, 00:20:46.699 "claimed": true, 00:20:46.699 "claim_type": "exclusive_write", 00:20:46.699 "zoned": false, 00:20:46.699 "supported_io_types": { 00:20:46.699 "read": true, 00:20:46.699 "write": true, 00:20:46.699 "unmap": true, 00:20:46.699 "flush": true, 00:20:46.699 "reset": true, 00:20:46.699 "nvme_admin": false, 00:20:46.699 "nvme_io": false, 00:20:46.699 "nvme_io_md": false, 
00:20:46.699 "write_zeroes": true, 00:20:46.699 "zcopy": true, 00:20:46.699 "get_zone_info": false, 00:20:46.699 "zone_management": false, 00:20:46.699 "zone_append": false, 00:20:46.699 "compare": false, 00:20:46.699 "compare_and_write": false, 00:20:46.699 "abort": true, 00:20:46.699 "seek_hole": false, 00:20:46.699 "seek_data": false, 00:20:46.699 "copy": true, 00:20:46.699 "nvme_iov_md": false 00:20:46.699 }, 00:20:46.699 "memory_domains": [ 00:20:46.699 { 00:20:46.699 "dma_device_id": "system", 00:20:46.699 "dma_device_type": 1 00:20:46.699 }, 00:20:46.699 { 00:20:46.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.699 "dma_device_type": 2 00:20:46.699 } 00:20:46.699 ], 00:20:46.699 "driver_specific": {} 00:20:46.699 }' 00:20:46.699 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:46.956 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.213 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.213 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.213 17:14:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.213 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:47.213 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:47.471 "name": "BaseBdev3", 00:20:47.471 "aliases": [ 00:20:47.471 "c93b2670-657b-4b09-9b5a-bf72270523cb" 00:20:47.471 ], 00:20:47.471 "product_name": "Malloc disk", 00:20:47.471 "block_size": 512, 00:20:47.471 "num_blocks": 65536, 00:20:47.471 "uuid": "c93b2670-657b-4b09-9b5a-bf72270523cb", 00:20:47.471 "assigned_rate_limits": { 00:20:47.471 "rw_ios_per_sec": 0, 00:20:47.471 "rw_mbytes_per_sec": 0, 00:20:47.471 "r_mbytes_per_sec": 0, 00:20:47.471 "w_mbytes_per_sec": 0 00:20:47.471 }, 00:20:47.471 "claimed": true, 00:20:47.471 "claim_type": "exclusive_write", 00:20:47.471 "zoned": false, 00:20:47.471 "supported_io_types": { 00:20:47.471 "read": true, 00:20:47.471 "write": true, 00:20:47.471 "unmap": true, 00:20:47.471 "flush": true, 00:20:47.471 "reset": true, 00:20:47.471 "nvme_admin": false, 00:20:47.471 "nvme_io": false, 00:20:47.471 "nvme_io_md": false, 00:20:47.471 "write_zeroes": true, 00:20:47.471 "zcopy": true, 00:20:47.471 "get_zone_info": false, 00:20:47.471 "zone_management": false, 00:20:47.471 "zone_append": false, 00:20:47.471 "compare": false, 00:20:47.471 "compare_and_write": false, 00:20:47.471 "abort": true, 00:20:47.471 "seek_hole": false, 00:20:47.471 "seek_data": false, 00:20:47.471 "copy": true, 00:20:47.471 "nvme_iov_md": false 00:20:47.471 }, 00:20:47.471 "memory_domains": [ 00:20:47.471 { 00:20:47.471 "dma_device_id": "system", 00:20:47.471 "dma_device_type": 1 00:20:47.471 }, 00:20:47.471 { 00:20:47.471 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:47.471 "dma_device_type": 2 00:20:47.471 } 00:20:47.471 ], 00:20:47.471 "driver_specific": {} 00:20:47.471 }' 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:47.471 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.729 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.729 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:47.729 17:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.729 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.729 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.729 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:47.987 [2024-07-23 17:14:43.267520] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.987 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.245 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.245 "name": "Existed_Raid", 00:20:48.245 "uuid": "84c25375-d923-44e6-98c3-2556201d0873", 00:20:48.245 "strip_size_kb": 0, 00:20:48.245 "state": "online", 00:20:48.245 "raid_level": "raid1", 
00:20:48.245 "superblock": false, 00:20:48.245 "num_base_bdevs": 3, 00:20:48.245 "num_base_bdevs_discovered": 2, 00:20:48.245 "num_base_bdevs_operational": 2, 00:20:48.245 "base_bdevs_list": [ 00:20:48.245 { 00:20:48.245 "name": null, 00:20:48.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.245 "is_configured": false, 00:20:48.245 "data_offset": 0, 00:20:48.245 "data_size": 65536 00:20:48.245 }, 00:20:48.245 { 00:20:48.245 "name": "BaseBdev2", 00:20:48.245 "uuid": "ee38142f-bc7c-4ddf-b2de-d1d40df36eb9", 00:20:48.245 "is_configured": true, 00:20:48.245 "data_offset": 0, 00:20:48.245 "data_size": 65536 00:20:48.245 }, 00:20:48.245 { 00:20:48.245 "name": "BaseBdev3", 00:20:48.245 "uuid": "c93b2670-657b-4b09-9b5a-bf72270523cb", 00:20:48.245 "is_configured": true, 00:20:48.245 "data_offset": 0, 00:20:48.245 "data_size": 65536 00:20:48.245 } 00:20:48.245 ] 00:20:48.245 }' 00:20:48.245 17:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.245 17:14:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:48.811 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:48.811 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:48.812 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.812 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:49.070 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:49.070 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:49.070 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:49.328 [2024-07-23 17:14:44.680285] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:49.328 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:49.328 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.328 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.328 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:49.586 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:49.586 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:49.586 17:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:49.844 [2024-07-23 17:14:45.178024] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:49.844 [2024-07-23 17:14:45.178101] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:49.844 [2024-07-23 17:14:45.190737] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:49.844 [2024-07-23 17:14:45.190772] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:49.844 [2024-07-23 17:14:45.190783] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x87b800 name Existed_Raid, state offline 00:20:49.844 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:49.844 17:14:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.844 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.844 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:50.102 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:50.102 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:50.102 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:20:50.102 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:50.102 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:50.102 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:50.668 BaseBdev2 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:50.668 17:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:50.927 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:51.493 [ 00:20:51.493 { 00:20:51.493 "name": "BaseBdev2", 00:20:51.493 "aliases": [ 00:20:51.493 "fae69e27-4f7a-4a31-b621-1625a869d5c7" 00:20:51.493 ], 00:20:51.493 "product_name": "Malloc disk", 00:20:51.493 "block_size": 512, 00:20:51.493 "num_blocks": 65536, 00:20:51.493 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:20:51.493 "assigned_rate_limits": { 00:20:51.493 "rw_ios_per_sec": 0, 00:20:51.493 "rw_mbytes_per_sec": 0, 00:20:51.493 "r_mbytes_per_sec": 0, 00:20:51.493 "w_mbytes_per_sec": 0 00:20:51.493 }, 00:20:51.493 "claimed": false, 00:20:51.493 "zoned": false, 00:20:51.493 "supported_io_types": { 00:20:51.493 "read": true, 00:20:51.493 "write": true, 00:20:51.493 "unmap": true, 00:20:51.493 "flush": true, 00:20:51.493 "reset": true, 00:20:51.493 "nvme_admin": false, 00:20:51.493 "nvme_io": false, 00:20:51.493 "nvme_io_md": false, 00:20:51.493 "write_zeroes": true, 00:20:51.493 "zcopy": true, 00:20:51.493 "get_zone_info": false, 00:20:51.493 "zone_management": false, 00:20:51.493 "zone_append": false, 00:20:51.493 "compare": false, 00:20:51.493 "compare_and_write": false, 00:20:51.493 "abort": true, 00:20:51.493 "seek_hole": false, 00:20:51.493 "seek_data": false, 00:20:51.493 "copy": true, 00:20:51.493 "nvme_iov_md": false 00:20:51.493 }, 00:20:51.493 "memory_domains": [ 00:20:51.493 { 00:20:51.493 "dma_device_id": "system", 00:20:51.493 "dma_device_type": 1 00:20:51.493 }, 00:20:51.493 { 00:20:51.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.493 "dma_device_type": 2 00:20:51.493 } 00:20:51.493 ], 00:20:51.493 "driver_specific": {} 00:20:51.493 } 00:20:51.493 ] 00:20:51.493 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:51.493 
17:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:51.493 17:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:51.493 17:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:51.493 BaseBdev3 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:51.752 17:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.752 17:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:52.320 [ 00:20:52.320 { 00:20:52.320 "name": "BaseBdev3", 00:20:52.320 "aliases": [ 00:20:52.320 "aad3f898-dee5-4ce9-9a27-f10c1b59f987" 00:20:52.320 ], 00:20:52.320 "product_name": "Malloc disk", 00:20:52.320 "block_size": 512, 00:20:52.320 "num_blocks": 65536, 00:20:52.320 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:20:52.320 "assigned_rate_limits": { 00:20:52.320 "rw_ios_per_sec": 0, 00:20:52.320 "rw_mbytes_per_sec": 0, 00:20:52.320 
"r_mbytes_per_sec": 0, 00:20:52.320 "w_mbytes_per_sec": 0 00:20:52.320 }, 00:20:52.320 "claimed": false, 00:20:52.320 "zoned": false, 00:20:52.320 "supported_io_types": { 00:20:52.320 "read": true, 00:20:52.320 "write": true, 00:20:52.320 "unmap": true, 00:20:52.320 "flush": true, 00:20:52.320 "reset": true, 00:20:52.320 "nvme_admin": false, 00:20:52.320 "nvme_io": false, 00:20:52.320 "nvme_io_md": false, 00:20:52.320 "write_zeroes": true, 00:20:52.320 "zcopy": true, 00:20:52.320 "get_zone_info": false, 00:20:52.320 "zone_management": false, 00:20:52.320 "zone_append": false, 00:20:52.320 "compare": false, 00:20:52.320 "compare_and_write": false, 00:20:52.320 "abort": true, 00:20:52.320 "seek_hole": false, 00:20:52.320 "seek_data": false, 00:20:52.320 "copy": true, 00:20:52.320 "nvme_iov_md": false 00:20:52.320 }, 00:20:52.320 "memory_domains": [ 00:20:52.320 { 00:20:52.320 "dma_device_id": "system", 00:20:52.320 "dma_device_type": 1 00:20:52.320 }, 00:20:52.320 { 00:20:52.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.320 "dma_device_type": 2 00:20:52.320 } 00:20:52.320 ], 00:20:52.320 "driver_specific": {} 00:20:52.320 } 00:20:52.320 ] 00:20:52.320 17:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:52.320 17:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:52.320 17:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:52.320 17:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:20:52.888 [2024-07-23 17:14:48.156106] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:52.888 [2024-07-23 17:14:48.156153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:20:52.888 [2024-07-23 17:14:48.156172] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:52.888 [2024-07-23 17:14:48.157508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.888 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.146 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.146 "name": "Existed_Raid", 00:20:53.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.146 "strip_size_kb": 0, 00:20:53.146 "state": 
"configuring", 00:20:53.146 "raid_level": "raid1", 00:20:53.146 "superblock": false, 00:20:53.146 "num_base_bdevs": 3, 00:20:53.146 "num_base_bdevs_discovered": 2, 00:20:53.146 "num_base_bdevs_operational": 3, 00:20:53.146 "base_bdevs_list": [ 00:20:53.146 { 00:20:53.146 "name": "BaseBdev1", 00:20:53.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.146 "is_configured": false, 00:20:53.146 "data_offset": 0, 00:20:53.146 "data_size": 0 00:20:53.146 }, 00:20:53.146 { 00:20:53.146 "name": "BaseBdev2", 00:20:53.146 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:20:53.146 "is_configured": true, 00:20:53.146 "data_offset": 0, 00:20:53.146 "data_size": 65536 00:20:53.146 }, 00:20:53.146 { 00:20:53.146 "name": "BaseBdev3", 00:20:53.146 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:20:53.146 "is_configured": true, 00:20:53.146 "data_offset": 0, 00:20:53.146 "data_size": 65536 00:20:53.146 } 00:20:53.146 ] 00:20:53.146 }' 00:20:53.146 17:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.146 17:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.714 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:53.973 [2024-07-23 17:14:49.246972] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.973 17:14:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.973 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.232 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.232 "name": "Existed_Raid", 00:20:54.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.232 "strip_size_kb": 0, 00:20:54.232 "state": "configuring", 00:20:54.232 "raid_level": "raid1", 00:20:54.232 "superblock": false, 00:20:54.232 "num_base_bdevs": 3, 00:20:54.232 "num_base_bdevs_discovered": 1, 00:20:54.232 "num_base_bdevs_operational": 3, 00:20:54.232 "base_bdevs_list": [ 00:20:54.232 { 00:20:54.232 "name": "BaseBdev1", 00:20:54.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.232 "is_configured": false, 00:20:54.232 "data_offset": 0, 00:20:54.232 "data_size": 0 00:20:54.232 }, 00:20:54.232 { 00:20:54.232 "name": null, 00:20:54.232 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:20:54.232 "is_configured": false, 00:20:54.232 "data_offset": 0, 00:20:54.232 "data_size": 65536 00:20:54.232 }, 00:20:54.232 { 00:20:54.232 "name": "BaseBdev3", 00:20:54.232 "uuid": 
"aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:20:54.232 "is_configured": true, 00:20:54.232 "data_offset": 0, 00:20:54.232 "data_size": 65536 00:20:54.232 } 00:20:54.232 ] 00:20:54.232 }' 00:20:54.232 17:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.232 17:14:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:54.799 17:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:54.799 17:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.057 17:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:55.057 17:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:55.625 [2024-07-23 17:14:50.870481] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:55.625 BaseBdev1 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:55.625 17:14:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.195 17:14:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:56.763 [ 00:20:56.763 { 00:20:56.763 "name": "BaseBdev1", 00:20:56.763 "aliases": [ 00:20:56.763 "34c94efe-78b0-4b85-8776-b1cc6723f769" 00:20:56.763 ], 00:20:56.763 "product_name": "Malloc disk", 00:20:56.763 "block_size": 512, 00:20:56.763 "num_blocks": 65536, 00:20:56.763 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:20:56.763 "assigned_rate_limits": { 00:20:56.763 "rw_ios_per_sec": 0, 00:20:56.763 "rw_mbytes_per_sec": 0, 00:20:56.763 "r_mbytes_per_sec": 0, 00:20:56.763 "w_mbytes_per_sec": 0 00:20:56.763 }, 00:20:56.763 "claimed": true, 00:20:56.763 "claim_type": "exclusive_write", 00:20:56.763 "zoned": false, 00:20:56.763 "supported_io_types": { 00:20:56.763 "read": true, 00:20:56.763 "write": true, 00:20:56.763 "unmap": true, 00:20:56.763 "flush": true, 00:20:56.763 "reset": true, 00:20:56.763 "nvme_admin": false, 00:20:56.763 "nvme_io": false, 00:20:56.763 "nvme_io_md": false, 00:20:56.763 "write_zeroes": true, 00:20:56.763 "zcopy": true, 00:20:56.763 "get_zone_info": false, 00:20:56.763 "zone_management": false, 00:20:56.763 "zone_append": false, 00:20:56.763 "compare": false, 00:20:56.763 "compare_and_write": false, 00:20:56.763 "abort": true, 00:20:56.763 "seek_hole": false, 00:20:56.763 "seek_data": false, 00:20:56.763 "copy": true, 00:20:56.763 "nvme_iov_md": false 00:20:56.763 }, 00:20:56.763 "memory_domains": [ 00:20:56.763 { 00:20:56.763 "dma_device_id": "system", 00:20:56.763 "dma_device_type": 1 00:20:56.763 }, 00:20:56.763 { 00:20:56.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.763 "dma_device_type": 2 00:20:56.763 } 00:20:56.763 ], 00:20:56.763 "driver_specific": {} 00:20:56.763 } 00:20:56.763 ] 
00:20:56.763 17:14:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:56.763 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:56.763 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.764 17:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.023 17:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.023 "name": "Existed_Raid", 00:20:57.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.023 "strip_size_kb": 0, 00:20:57.023 "state": "configuring", 00:20:57.023 "raid_level": "raid1", 00:20:57.023 "superblock": false, 00:20:57.023 "num_base_bdevs": 3, 00:20:57.023 
"num_base_bdevs_discovered": 2, 00:20:57.023 "num_base_bdevs_operational": 3, 00:20:57.023 "base_bdevs_list": [ 00:20:57.023 { 00:20:57.023 "name": "BaseBdev1", 00:20:57.023 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:20:57.023 "is_configured": true, 00:20:57.023 "data_offset": 0, 00:20:57.023 "data_size": 65536 00:20:57.023 }, 00:20:57.023 { 00:20:57.023 "name": null, 00:20:57.023 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:20:57.023 "is_configured": false, 00:20:57.023 "data_offset": 0, 00:20:57.023 "data_size": 65536 00:20:57.023 }, 00:20:57.023 { 00:20:57.023 "name": "BaseBdev3", 00:20:57.023 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:20:57.023 "is_configured": true, 00:20:57.023 "data_offset": 0, 00:20:57.023 "data_size": 65536 00:20:57.023 } 00:20:57.023 ] 00:20:57.023 }' 00:20:57.023 17:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.023 17:14:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:57.591 17:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.591 17:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:57.850 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:57.850 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:58.110 [2024-07-23 17:14:53.288932] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.111 "name": "Existed_Raid", 00:20:58.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.111 "strip_size_kb": 0, 00:20:58.111 "state": "configuring", 00:20:58.111 "raid_level": "raid1", 00:20:58.111 "superblock": false, 00:20:58.111 "num_base_bdevs": 3, 00:20:58.111 "num_base_bdevs_discovered": 1, 00:20:58.111 "num_base_bdevs_operational": 3, 00:20:58.111 "base_bdevs_list": [ 00:20:58.111 { 00:20:58.111 "name": "BaseBdev1", 00:20:58.111 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:20:58.111 "is_configured": true, 00:20:58.111 "data_offset": 0, 00:20:58.111 "data_size": 65536 
00:20:58.111 }, 00:20:58.111 { 00:20:58.111 "name": null, 00:20:58.111 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:20:58.111 "is_configured": false, 00:20:58.111 "data_offset": 0, 00:20:58.111 "data_size": 65536 00:20:58.111 }, 00:20:58.111 { 00:20:58.111 "name": null, 00:20:58.111 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:20:58.111 "is_configured": false, 00:20:58.111 "data_offset": 0, 00:20:58.111 "data_size": 65536 00:20:58.111 } 00:20:58.111 ] 00:20:58.111 }' 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.111 17:14:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:58.678 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.678 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:58.938 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:58.938 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:59.297 [2024-07-23 17:14:54.456042] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.298 17:14:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.298 "name": "Existed_Raid", 00:20:59.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.298 "strip_size_kb": 0, 00:20:59.298 "state": "configuring", 00:20:59.298 "raid_level": "raid1", 00:20:59.298 "superblock": false, 00:20:59.298 "num_base_bdevs": 3, 00:20:59.298 "num_base_bdevs_discovered": 2, 00:20:59.298 "num_base_bdevs_operational": 3, 00:20:59.298 "base_bdevs_list": [ 00:20:59.298 { 00:20:59.298 "name": "BaseBdev1", 00:20:59.298 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:20:59.298 "is_configured": true, 00:20:59.298 "data_offset": 0, 00:20:59.298 "data_size": 65536 00:20:59.298 }, 00:20:59.298 { 00:20:59.298 "name": null, 00:20:59.298 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:20:59.298 "is_configured": false, 00:20:59.298 "data_offset": 0, 00:20:59.298 "data_size": 65536 00:20:59.298 }, 00:20:59.298 { 00:20:59.298 "name": "BaseBdev3", 00:20:59.298 "uuid": 
"aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:20:59.298 "is_configured": true, 00:20:59.298 "data_offset": 0, 00:20:59.298 "data_size": 65536 00:20:59.298 } 00:20:59.298 ] 00:20:59.298 }' 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.298 17:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:59.865 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.865 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:00.123 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:00.123 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:00.381 [2024-07-23 17:14:55.655232] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:00.381 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:00.381 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.381 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.381 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:00.381 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:00.381 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:00.382 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.382 17:14:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.382 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.382 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.382 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.382 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.642 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.642 "name": "Existed_Raid", 00:21:00.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.642 "strip_size_kb": 0, 00:21:00.642 "state": "configuring", 00:21:00.642 "raid_level": "raid1", 00:21:00.642 "superblock": false, 00:21:00.642 "num_base_bdevs": 3, 00:21:00.642 "num_base_bdevs_discovered": 1, 00:21:00.642 "num_base_bdevs_operational": 3, 00:21:00.642 "base_bdevs_list": [ 00:21:00.642 { 00:21:00.642 "name": null, 00:21:00.642 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:21:00.642 "is_configured": false, 00:21:00.642 "data_offset": 0, 00:21:00.642 "data_size": 65536 00:21:00.642 }, 00:21:00.642 { 00:21:00.642 "name": null, 00:21:00.643 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:21:00.643 "is_configured": false, 00:21:00.643 "data_offset": 0, 00:21:00.643 "data_size": 65536 00:21:00.643 }, 00:21:00.643 { 00:21:00.643 "name": "BaseBdev3", 00:21:00.643 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:21:00.643 "is_configured": true, 00:21:00.643 "data_offset": 0, 00:21:00.643 "data_size": 65536 00:21:00.643 } 00:21:00.643 ] 00:21:00.643 }' 00:21:00.643 17:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.643 17:14:55 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:01.579 17:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.579 17:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:01.579 17:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:01.579 17:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:01.838 [2024-07-23 17:14:57.139303] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.838 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.096 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.096 "name": "Existed_Raid", 00:21:02.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.096 "strip_size_kb": 0, 00:21:02.096 "state": "configuring", 00:21:02.096 "raid_level": "raid1", 00:21:02.096 "superblock": false, 00:21:02.096 "num_base_bdevs": 3, 00:21:02.096 "num_base_bdevs_discovered": 2, 00:21:02.096 "num_base_bdevs_operational": 3, 00:21:02.096 "base_bdevs_list": [ 00:21:02.096 { 00:21:02.096 "name": null, 00:21:02.096 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:21:02.096 "is_configured": false, 00:21:02.096 "data_offset": 0, 00:21:02.096 "data_size": 65536 00:21:02.096 }, 00:21:02.096 { 00:21:02.096 "name": "BaseBdev2", 00:21:02.096 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:21:02.096 "is_configured": true, 00:21:02.096 "data_offset": 0, 00:21:02.096 "data_size": 65536 00:21:02.096 }, 00:21:02.097 { 00:21:02.097 "name": "BaseBdev3", 00:21:02.097 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:21:02.097 "is_configured": true, 00:21:02.097 "data_offset": 0, 00:21:02.097 "data_size": 65536 00:21:02.097 } 00:21:02.097 ] 00:21:02.097 }' 00:21:02.097 17:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.097 17:14:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:02.664 17:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.664 17:14:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:03.231 17:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:03.231 17:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:03.231 17:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.489 17:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 34c94efe-78b0-4b85-8776-b1cc6723f769 00:21:03.747 [2024-07-23 17:14:59.037532] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:03.747 [2024-07-23 17:14:59.037585] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x87dbf0 00:21:03.748 [2024-07-23 17:14:59.037594] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:03.748 [2024-07-23 17:14:59.037812] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x884400 00:21:03.748 [2024-07-23 17:14:59.037966] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x87dbf0 00:21:03.748 [2024-07-23 17:14:59.037977] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x87dbf0 00:21:03.748 [2024-07-23 17:14:59.038159] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:03.748 NewBaseBdev 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # 
local bdev_timeout= 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:03.748 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.006 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:04.265 [ 00:21:04.265 { 00:21:04.265 "name": "NewBaseBdev", 00:21:04.265 "aliases": [ 00:21:04.265 "34c94efe-78b0-4b85-8776-b1cc6723f769" 00:21:04.265 ], 00:21:04.265 "product_name": "Malloc disk", 00:21:04.265 "block_size": 512, 00:21:04.265 "num_blocks": 65536, 00:21:04.265 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:21:04.265 "assigned_rate_limits": { 00:21:04.265 "rw_ios_per_sec": 0, 00:21:04.265 "rw_mbytes_per_sec": 0, 00:21:04.265 "r_mbytes_per_sec": 0, 00:21:04.265 "w_mbytes_per_sec": 0 00:21:04.265 }, 00:21:04.265 "claimed": true, 00:21:04.265 "claim_type": "exclusive_write", 00:21:04.265 "zoned": false, 00:21:04.265 "supported_io_types": { 00:21:04.265 "read": true, 00:21:04.265 "write": true, 00:21:04.265 "unmap": true, 00:21:04.265 "flush": true, 00:21:04.265 "reset": true, 00:21:04.265 "nvme_admin": false, 00:21:04.265 "nvme_io": false, 00:21:04.265 "nvme_io_md": false, 00:21:04.265 "write_zeroes": true, 00:21:04.265 "zcopy": true, 00:21:04.265 "get_zone_info": false, 00:21:04.265 "zone_management": false, 00:21:04.265 "zone_append": false, 00:21:04.265 "compare": false, 00:21:04.265 "compare_and_write": false, 00:21:04.265 "abort": true, 00:21:04.265 "seek_hole": false, 
00:21:04.265 "seek_data": false, 00:21:04.265 "copy": true, 00:21:04.265 "nvme_iov_md": false 00:21:04.265 }, 00:21:04.265 "memory_domains": [ 00:21:04.265 { 00:21:04.265 "dma_device_id": "system", 00:21:04.265 "dma_device_type": 1 00:21:04.265 }, 00:21:04.265 { 00:21:04.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.265 "dma_device_type": 2 00:21:04.265 } 00:21:04.265 ], 00:21:04.265 "driver_specific": {} 00:21:04.265 } 00:21:04.265 ] 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.265 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:21:04.524 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.524 "name": "Existed_Raid", 00:21:04.524 "uuid": "78ed42c6-4204-4f80-9dec-fa3f17fbf33e", 00:21:04.524 "strip_size_kb": 0, 00:21:04.524 "state": "online", 00:21:04.524 "raid_level": "raid1", 00:21:04.524 "superblock": false, 00:21:04.524 "num_base_bdevs": 3, 00:21:04.524 "num_base_bdevs_discovered": 3, 00:21:04.524 "num_base_bdevs_operational": 3, 00:21:04.524 "base_bdevs_list": [ 00:21:04.524 { 00:21:04.524 "name": "NewBaseBdev", 00:21:04.524 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:21:04.524 "is_configured": true, 00:21:04.524 "data_offset": 0, 00:21:04.524 "data_size": 65536 00:21:04.524 }, 00:21:04.524 { 00:21:04.524 "name": "BaseBdev2", 00:21:04.524 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:21:04.524 "is_configured": true, 00:21:04.524 "data_offset": 0, 00:21:04.524 "data_size": 65536 00:21:04.524 }, 00:21:04.524 { 00:21:04.524 "name": "BaseBdev3", 00:21:04.524 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:21:04.524 "is_configured": true, 00:21:04.524 "data_offset": 0, 00:21:04.524 "data_size": 65536 00:21:04.524 } 00:21:04.524 ] 00:21:04.524 }' 00:21:04.524 17:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.524 17:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:05.092 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:05.660 [2024-07-23 17:15:00.898826] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:05.660 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:05.660 "name": "Existed_Raid", 00:21:05.660 "aliases": [ 00:21:05.660 "78ed42c6-4204-4f80-9dec-fa3f17fbf33e" 00:21:05.660 ], 00:21:05.660 "product_name": "Raid Volume", 00:21:05.660 "block_size": 512, 00:21:05.660 "num_blocks": 65536, 00:21:05.660 "uuid": "78ed42c6-4204-4f80-9dec-fa3f17fbf33e", 00:21:05.660 "assigned_rate_limits": { 00:21:05.660 "rw_ios_per_sec": 0, 00:21:05.660 "rw_mbytes_per_sec": 0, 00:21:05.660 "r_mbytes_per_sec": 0, 00:21:05.660 "w_mbytes_per_sec": 0 00:21:05.660 }, 00:21:05.660 "claimed": false, 00:21:05.660 "zoned": false, 00:21:05.660 "supported_io_types": { 00:21:05.660 "read": true, 00:21:05.660 "write": true, 00:21:05.660 "unmap": false, 00:21:05.660 "flush": false, 00:21:05.660 "reset": true, 00:21:05.660 "nvme_admin": false, 00:21:05.660 "nvme_io": false, 00:21:05.660 "nvme_io_md": false, 00:21:05.660 "write_zeroes": true, 00:21:05.660 "zcopy": false, 00:21:05.660 "get_zone_info": false, 00:21:05.660 "zone_management": false, 00:21:05.660 "zone_append": false, 00:21:05.660 "compare": false, 00:21:05.660 "compare_and_write": false, 00:21:05.660 "abort": false, 00:21:05.660 "seek_hole": false, 00:21:05.660 "seek_data": false, 00:21:05.660 "copy": false, 00:21:05.660 "nvme_iov_md": false 00:21:05.660 }, 00:21:05.660 "memory_domains": [ 00:21:05.660 { 00:21:05.660 "dma_device_id": "system", 
00:21:05.660 "dma_device_type": 1 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.660 "dma_device_type": 2 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "dma_device_id": "system", 00:21:05.660 "dma_device_type": 1 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.660 "dma_device_type": 2 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "dma_device_id": "system", 00:21:05.660 "dma_device_type": 1 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.660 "dma_device_type": 2 00:21:05.660 } 00:21:05.660 ], 00:21:05.660 "driver_specific": { 00:21:05.660 "raid": { 00:21:05.660 "uuid": "78ed42c6-4204-4f80-9dec-fa3f17fbf33e", 00:21:05.660 "strip_size_kb": 0, 00:21:05.660 "state": "online", 00:21:05.660 "raid_level": "raid1", 00:21:05.660 "superblock": false, 00:21:05.660 "num_base_bdevs": 3, 00:21:05.660 "num_base_bdevs_discovered": 3, 00:21:05.660 "num_base_bdevs_operational": 3, 00:21:05.660 "base_bdevs_list": [ 00:21:05.660 { 00:21:05.660 "name": "NewBaseBdev", 00:21:05.660 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:21:05.660 "is_configured": true, 00:21:05.660 "data_offset": 0, 00:21:05.660 "data_size": 65536 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "name": "BaseBdev2", 00:21:05.660 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:21:05.660 "is_configured": true, 00:21:05.660 "data_offset": 0, 00:21:05.660 "data_size": 65536 00:21:05.660 }, 00:21:05.660 { 00:21:05.660 "name": "BaseBdev3", 00:21:05.660 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:21:05.660 "is_configured": true, 00:21:05.660 "data_offset": 0, 00:21:05.660 "data_size": 65536 00:21:05.660 } 00:21:05.660 ] 00:21:05.660 } 00:21:05.660 } 00:21:05.660 }' 00:21:05.660 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:05.660 17:15:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:05.660 BaseBdev2 00:21:05.660 BaseBdev3' 00:21:05.660 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:05.660 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:05.660 17:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:05.934 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:05.934 "name": "NewBaseBdev", 00:21:05.934 "aliases": [ 00:21:05.934 "34c94efe-78b0-4b85-8776-b1cc6723f769" 00:21:05.934 ], 00:21:05.934 "product_name": "Malloc disk", 00:21:05.934 "block_size": 512, 00:21:05.934 "num_blocks": 65536, 00:21:05.934 "uuid": "34c94efe-78b0-4b85-8776-b1cc6723f769", 00:21:05.934 "assigned_rate_limits": { 00:21:05.934 "rw_ios_per_sec": 0, 00:21:05.934 "rw_mbytes_per_sec": 0, 00:21:05.934 "r_mbytes_per_sec": 0, 00:21:05.934 "w_mbytes_per_sec": 0 00:21:05.934 }, 00:21:05.934 "claimed": true, 00:21:05.934 "claim_type": "exclusive_write", 00:21:05.934 "zoned": false, 00:21:05.934 "supported_io_types": { 00:21:05.934 "read": true, 00:21:05.934 "write": true, 00:21:05.934 "unmap": true, 00:21:05.934 "flush": true, 00:21:05.934 "reset": true, 00:21:05.934 "nvme_admin": false, 00:21:05.934 "nvme_io": false, 00:21:05.934 "nvme_io_md": false, 00:21:05.934 "write_zeroes": true, 00:21:05.934 "zcopy": true, 00:21:05.934 "get_zone_info": false, 00:21:05.934 "zone_management": false, 00:21:05.934 "zone_append": false, 00:21:05.934 "compare": false, 00:21:05.934 "compare_and_write": false, 00:21:05.934 "abort": true, 00:21:05.934 "seek_hole": false, 00:21:05.934 "seek_data": false, 00:21:05.934 "copy": true, 00:21:05.934 "nvme_iov_md": false 00:21:05.934 }, 00:21:05.934 "memory_domains": [ 
00:21:05.934 { 00:21:05.934 "dma_device_id": "system", 00:21:05.934 "dma_device_type": 1 00:21:05.934 }, 00:21:05.934 { 00:21:05.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.934 "dma_device_type": 2 00:21:05.934 } 00:21:05.934 ], 00:21:05.934 "driver_specific": {} 00:21:05.934 }' 00:21:05.934 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:05.934 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:05.934 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:05.934 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:06.194 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.453 17:15:01 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.453 "name": "BaseBdev2", 00:21:06.453 "aliases": [ 00:21:06.453 "fae69e27-4f7a-4a31-b621-1625a869d5c7" 00:21:06.453 ], 00:21:06.453 "product_name": "Malloc disk", 00:21:06.453 "block_size": 512, 00:21:06.453 "num_blocks": 65536, 00:21:06.453 "uuid": "fae69e27-4f7a-4a31-b621-1625a869d5c7", 00:21:06.453 "assigned_rate_limits": { 00:21:06.453 "rw_ios_per_sec": 0, 00:21:06.453 "rw_mbytes_per_sec": 0, 00:21:06.453 "r_mbytes_per_sec": 0, 00:21:06.453 "w_mbytes_per_sec": 0 00:21:06.453 }, 00:21:06.453 "claimed": true, 00:21:06.453 "claim_type": "exclusive_write", 00:21:06.453 "zoned": false, 00:21:06.453 "supported_io_types": { 00:21:06.453 "read": true, 00:21:06.453 "write": true, 00:21:06.453 "unmap": true, 00:21:06.453 "flush": true, 00:21:06.453 "reset": true, 00:21:06.453 "nvme_admin": false, 00:21:06.453 "nvme_io": false, 00:21:06.453 "nvme_io_md": false, 00:21:06.453 "write_zeroes": true, 00:21:06.453 "zcopy": true, 00:21:06.453 "get_zone_info": false, 00:21:06.453 "zone_management": false, 00:21:06.453 "zone_append": false, 00:21:06.453 "compare": false, 00:21:06.453 "compare_and_write": false, 00:21:06.453 "abort": true, 00:21:06.453 "seek_hole": false, 00:21:06.453 "seek_data": false, 00:21:06.453 "copy": true, 00:21:06.453 "nvme_iov_md": false 00:21:06.453 }, 00:21:06.453 "memory_domains": [ 00:21:06.453 { 00:21:06.453 "dma_device_id": "system", 00:21:06.453 "dma_device_type": 1 00:21:06.453 }, 00:21:06.453 { 00:21:06.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.453 "dma_device_type": 2 00:21:06.453 } 00:21:06.453 ], 00:21:06.453 "driver_specific": {} 00:21:06.453 }' 00:21:06.453 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.712 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.712 17:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:06.712 17:15:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.712 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.712 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.712 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:06.972 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.541 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.541 "name": "BaseBdev3", 00:21:07.541 "aliases": [ 00:21:07.541 "aad3f898-dee5-4ce9-9a27-f10c1b59f987" 00:21:07.541 ], 00:21:07.541 "product_name": "Malloc disk", 00:21:07.541 "block_size": 512, 00:21:07.541 "num_blocks": 65536, 00:21:07.541 "uuid": "aad3f898-dee5-4ce9-9a27-f10c1b59f987", 00:21:07.541 "assigned_rate_limits": { 00:21:07.541 "rw_ios_per_sec": 0, 00:21:07.541 "rw_mbytes_per_sec": 0, 00:21:07.541 "r_mbytes_per_sec": 0, 00:21:07.541 "w_mbytes_per_sec": 0 00:21:07.541 }, 00:21:07.541 "claimed": true, 00:21:07.541 "claim_type": "exclusive_write", 
00:21:07.541 "zoned": false, 00:21:07.541 "supported_io_types": { 00:21:07.541 "read": true, 00:21:07.541 "write": true, 00:21:07.541 "unmap": true, 00:21:07.541 "flush": true, 00:21:07.541 "reset": true, 00:21:07.541 "nvme_admin": false, 00:21:07.541 "nvme_io": false, 00:21:07.541 "nvme_io_md": false, 00:21:07.541 "write_zeroes": true, 00:21:07.541 "zcopy": true, 00:21:07.541 "get_zone_info": false, 00:21:07.541 "zone_management": false, 00:21:07.541 "zone_append": false, 00:21:07.541 "compare": false, 00:21:07.541 "compare_and_write": false, 00:21:07.541 "abort": true, 00:21:07.541 "seek_hole": false, 00:21:07.541 "seek_data": false, 00:21:07.541 "copy": true, 00:21:07.541 "nvme_iov_md": false 00:21:07.541 }, 00:21:07.541 "memory_domains": [ 00:21:07.541 { 00:21:07.541 "dma_device_id": "system", 00:21:07.541 "dma_device_type": 1 00:21:07.541 }, 00:21:07.541 { 00:21:07.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.541 "dma_device_type": 2 00:21:07.541 } 00:21:07.541 ], 00:21:07.541 "driver_specific": {} 00:21:07.541 }' 00:21:07.541 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.541 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.801 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:07.801 17:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.801 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.801 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:07.801 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.801 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.801 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:07.801 17:15:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.059 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.059 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.059 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:08.318 [2024-07-23 17:15:03.609697] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:08.318 [2024-07-23 17:15:03.609729] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:08.318 [2024-07-23 17:15:03.609795] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:08.318 [2024-07-23 17:15:03.610086] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:08.318 [2024-07-23 17:15:03.610100] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x87dbf0 name Existed_Raid, state offline 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4156196 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4156196 ']' 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4156196 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4156196 00:21:08.318 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:08.318 17:15:03 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:08.319 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4156196' 00:21:08.319 killing process with pid 4156196 00:21:08.319 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4156196 00:21:08.319 [2024-07-23 17:15:03.692086] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:08.319 17:15:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4156196 00:21:08.577 [2024-07-23 17:15:03.756685] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:08.835 00:21:08.835 real 0m34.247s 00:21:08.835 user 1m2.661s 00:21:08.835 sys 0m5.981s 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.835 ************************************ 00:21:08.835 END TEST raid_state_function_test 00:21:08.835 ************************************ 00:21:08.835 17:15:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:08.835 17:15:04 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:21:08.835 17:15:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:08.835 17:15:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:08.835 17:15:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:08.835 ************************************ 00:21:08.835 START TEST raid_state_function_test_sb 00:21:08.835 ************************************ 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test 
raid1 3 true 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:08.835 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4161422 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4161422' 00:21:08.836 Process raid pid: 4161422 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4161422 /var/tmp/spdk-raid.sock 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4161422 ']' 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:21:08.836 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:08.836 17:15:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:09.094 [2024-07-23 17:15:04.310591] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:21:09.094 [2024-07-23 17:15:04.310658] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:09.094 [2024-07-23 17:15:04.443942] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:09.094 [2024-07-23 17:15:04.494742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:09.352 [2024-07-23 17:15:04.552346] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:09.352 [2024-07-23 17:15:04.552371] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:09.917 17:15:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:09.917 17:15:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:09.917 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:21:10.174 [2024-07-23 17:15:05.470539] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:10.174 [2024-07-23 17:15:05.470580] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:10.174 [2024-07-23 17:15:05.470591] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:21:10.174 [2024-07-23 17:15:05.470603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:10.174 [2024-07-23 17:15:05.470612] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:10.174 [2024-07-23 17:15:05.470623] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.174 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.432 17:15:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.432 "name": "Existed_Raid", 00:21:10.432 "uuid": "19bc196c-1b4b-4073-928d-8715470c1df5", 00:21:10.432 "strip_size_kb": 0, 00:21:10.432 "state": "configuring", 00:21:10.432 "raid_level": "raid1", 00:21:10.432 "superblock": true, 00:21:10.432 "num_base_bdevs": 3, 00:21:10.432 "num_base_bdevs_discovered": 0, 00:21:10.432 "num_base_bdevs_operational": 3, 00:21:10.432 "base_bdevs_list": [ 00:21:10.432 { 00:21:10.432 "name": "BaseBdev1", 00:21:10.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.432 "is_configured": false, 00:21:10.432 "data_offset": 0, 00:21:10.432 "data_size": 0 00:21:10.432 }, 00:21:10.432 { 00:21:10.432 "name": "BaseBdev2", 00:21:10.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.432 "is_configured": false, 00:21:10.432 "data_offset": 0, 00:21:10.432 "data_size": 0 00:21:10.432 }, 00:21:10.432 { 00:21:10.432 "name": "BaseBdev3", 00:21:10.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.432 "is_configured": false, 00:21:10.432 "data_offset": 0, 00:21:10.432 "data_size": 0 00:21:10.432 } 00:21:10.432 ] 00:21:10.432 }' 00:21:10.433 17:15:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.433 17:15:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:11.366 17:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:11.366 [2024-07-23 17:15:06.717698] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:11.366 [2024-07-23 17:15:06.717730] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b3280 name Existed_Raid, state configuring 00:21:11.366 17:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:21:11.624 [2024-07-23 17:15:06.970388] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:11.624 [2024-07-23 17:15:06.970426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:11.624 [2024-07-23 17:15:06.970436] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:11.624 [2024-07-23 17:15:06.970448] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:11.624 [2024-07-23 17:15:06.970457] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:11.624 [2024-07-23 17:15:06.970468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:11.624 17:15:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:11.882 [2024-07-23 17:15:07.224773] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:11.882 BaseBdev1 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:21:11.882 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:12.156 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:12.426 [ 00:21:12.426 { 00:21:12.426 "name": "BaseBdev1", 00:21:12.426 "aliases": [ 00:21:12.426 "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2" 00:21:12.426 ], 00:21:12.426 "product_name": "Malloc disk", 00:21:12.426 "block_size": 512, 00:21:12.426 "num_blocks": 65536, 00:21:12.426 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:12.426 "assigned_rate_limits": { 00:21:12.426 "rw_ios_per_sec": 0, 00:21:12.426 "rw_mbytes_per_sec": 0, 00:21:12.426 "r_mbytes_per_sec": 0, 00:21:12.426 "w_mbytes_per_sec": 0 00:21:12.426 }, 00:21:12.426 "claimed": true, 00:21:12.426 "claim_type": "exclusive_write", 00:21:12.426 "zoned": false, 00:21:12.426 "supported_io_types": { 00:21:12.426 "read": true, 00:21:12.426 "write": true, 00:21:12.426 "unmap": true, 00:21:12.426 "flush": true, 00:21:12.426 "reset": true, 00:21:12.426 "nvme_admin": false, 00:21:12.426 "nvme_io": false, 00:21:12.426 "nvme_io_md": false, 00:21:12.426 "write_zeroes": true, 00:21:12.426 "zcopy": true, 00:21:12.426 "get_zone_info": false, 00:21:12.426 "zone_management": false, 00:21:12.426 "zone_append": false, 00:21:12.426 "compare": false, 00:21:12.426 "compare_and_write": false, 00:21:12.426 "abort": true, 00:21:12.426 "seek_hole": false, 00:21:12.426 "seek_data": false, 00:21:12.426 "copy": true, 00:21:12.426 "nvme_iov_md": false 00:21:12.426 }, 00:21:12.426 "memory_domains": [ 00:21:12.426 { 00:21:12.426 "dma_device_id": "system", 00:21:12.426 "dma_device_type": 1 00:21:12.426 }, 00:21:12.426 { 00:21:12.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.426 
"dma_device_type": 2 00:21:12.426 } 00:21:12.426 ], 00:21:12.426 "driver_specific": {} 00:21:12.426 } 00:21:12.426 ] 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.426 17:15:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.683 17:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.683 "name": "Existed_Raid", 00:21:12.683 "uuid": "79f3fbf9-c911-4c8b-8201-236308f81686", 00:21:12.683 "strip_size_kb": 0, 
00:21:12.683 "state": "configuring", 00:21:12.683 "raid_level": "raid1", 00:21:12.683 "superblock": true, 00:21:12.683 "num_base_bdevs": 3, 00:21:12.683 "num_base_bdevs_discovered": 1, 00:21:12.683 "num_base_bdevs_operational": 3, 00:21:12.683 "base_bdevs_list": [ 00:21:12.683 { 00:21:12.683 "name": "BaseBdev1", 00:21:12.683 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:12.683 "is_configured": true, 00:21:12.683 "data_offset": 2048, 00:21:12.683 "data_size": 63488 00:21:12.683 }, 00:21:12.683 { 00:21:12.683 "name": "BaseBdev2", 00:21:12.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.683 "is_configured": false, 00:21:12.683 "data_offset": 0, 00:21:12.683 "data_size": 0 00:21:12.683 }, 00:21:12.683 { 00:21:12.683 "name": "BaseBdev3", 00:21:12.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.683 "is_configured": false, 00:21:12.683 "data_offset": 0, 00:21:12.683 "data_size": 0 00:21:12.683 } 00:21:12.683 ] 00:21:12.683 }' 00:21:12.683 17:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.683 17:15:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:13.248 17:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:13.506 [2024-07-23 17:15:08.812961] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:13.506 [2024-07-23 17:15:08.813004] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b2bb0 name Existed_Raid, state configuring 00:21:13.506 17:15:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:21:13.764 [2024-07-23 17:15:09.065661] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:13.764 [2024-07-23 17:15:09.067080] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:13.764 [2024-07-23 17:15:09.067112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:13.764 [2024-07-23 17:15:09.067122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:13.764 [2024-07-23 17:15:09.067134] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.764 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:14.022 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.022 "name": "Existed_Raid", 00:21:14.022 "uuid": "15819ae6-91ab-4d66-b028-25a5a9355107", 00:21:14.022 "strip_size_kb": 0, 00:21:14.022 "state": "configuring", 00:21:14.022 "raid_level": "raid1", 00:21:14.022 "superblock": true, 00:21:14.022 "num_base_bdevs": 3, 00:21:14.022 "num_base_bdevs_discovered": 1, 00:21:14.022 "num_base_bdevs_operational": 3, 00:21:14.022 "base_bdevs_list": [ 00:21:14.022 { 00:21:14.022 "name": "BaseBdev1", 00:21:14.022 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:14.022 "is_configured": true, 00:21:14.022 "data_offset": 2048, 00:21:14.022 "data_size": 63488 00:21:14.022 }, 00:21:14.022 { 00:21:14.022 "name": "BaseBdev2", 00:21:14.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.022 "is_configured": false, 00:21:14.022 "data_offset": 0, 00:21:14.022 "data_size": 0 00:21:14.022 }, 00:21:14.022 { 00:21:14.022 "name": "BaseBdev3", 00:21:14.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.022 "is_configured": false, 00:21:14.022 "data_offset": 0, 00:21:14.022 "data_size": 0 00:21:14.022 } 00:21:14.022 ] 00:21:14.022 }' 00:21:14.022 17:15:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.022 17:15:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:14.955 
[2024-07-23 17:15:10.268693] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:14.955 BaseBdev2 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:14.955 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:15.213 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:15.471 [ 00:21:15.471 { 00:21:15.471 "name": "BaseBdev2", 00:21:15.471 "aliases": [ 00:21:15.471 "af4a8296-c3fa-4461-9efd-e699fa52c8a6" 00:21:15.471 ], 00:21:15.471 "product_name": "Malloc disk", 00:21:15.471 "block_size": 512, 00:21:15.471 "num_blocks": 65536, 00:21:15.471 "uuid": "af4a8296-c3fa-4461-9efd-e699fa52c8a6", 00:21:15.471 "assigned_rate_limits": { 00:21:15.471 "rw_ios_per_sec": 0, 00:21:15.471 "rw_mbytes_per_sec": 0, 00:21:15.471 "r_mbytes_per_sec": 0, 00:21:15.471 "w_mbytes_per_sec": 0 00:21:15.471 }, 00:21:15.471 "claimed": true, 00:21:15.471 "claim_type": "exclusive_write", 00:21:15.471 "zoned": false, 00:21:15.471 "supported_io_types": { 00:21:15.471 "read": true, 00:21:15.471 "write": true, 00:21:15.471 "unmap": 
true, 00:21:15.471 "flush": true, 00:21:15.471 "reset": true, 00:21:15.471 "nvme_admin": false, 00:21:15.471 "nvme_io": false, 00:21:15.471 "nvme_io_md": false, 00:21:15.471 "write_zeroes": true, 00:21:15.471 "zcopy": true, 00:21:15.471 "get_zone_info": false, 00:21:15.471 "zone_management": false, 00:21:15.471 "zone_append": false, 00:21:15.471 "compare": false, 00:21:15.471 "compare_and_write": false, 00:21:15.471 "abort": true, 00:21:15.471 "seek_hole": false, 00:21:15.471 "seek_data": false, 00:21:15.471 "copy": true, 00:21:15.471 "nvme_iov_md": false 00:21:15.471 }, 00:21:15.471 "memory_domains": [ 00:21:15.471 { 00:21:15.472 "dma_device_id": "system", 00:21:15.472 "dma_device_type": 1 00:21:15.472 }, 00:21:15.472 { 00:21:15.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.472 "dma_device_type": 2 00:21:15.472 } 00:21:15.472 ], 00:21:15.472 "driver_specific": {} 00:21:15.472 } 00:21:15.472 ] 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:15.472 
17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.472 17:15:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.730 17:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.730 "name": "Existed_Raid", 00:21:15.730 "uuid": "15819ae6-91ab-4d66-b028-25a5a9355107", 00:21:15.730 "strip_size_kb": 0, 00:21:15.730 "state": "configuring", 00:21:15.730 "raid_level": "raid1", 00:21:15.730 "superblock": true, 00:21:15.730 "num_base_bdevs": 3, 00:21:15.730 "num_base_bdevs_discovered": 2, 00:21:15.730 "num_base_bdevs_operational": 3, 00:21:15.730 "base_bdevs_list": [ 00:21:15.730 { 00:21:15.730 "name": "BaseBdev1", 00:21:15.730 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:15.730 "is_configured": true, 00:21:15.730 "data_offset": 2048, 00:21:15.730 "data_size": 63488 00:21:15.730 }, 00:21:15.730 { 00:21:15.730 "name": "BaseBdev2", 00:21:15.730 "uuid": "af4a8296-c3fa-4461-9efd-e699fa52c8a6", 00:21:15.730 "is_configured": true, 00:21:15.730 "data_offset": 2048, 00:21:15.730 "data_size": 63488 00:21:15.730 }, 00:21:15.730 { 00:21:15.730 "name": "BaseBdev3", 00:21:15.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.730 "is_configured": false, 00:21:15.730 "data_offset": 0, 00:21:15.730 "data_size": 0 00:21:15.730 } 00:21:15.730 ] 00:21:15.730 }' 00:21:15.730 
17:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.730 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:16.295 17:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:16.553 [2024-07-23 17:15:11.880314] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:16.553 [2024-07-23 17:15:11.880474] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11b2800 00:21:16.553 [2024-07-23 17:15:11.880489] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:16.553 [2024-07-23 17:15:11.880668] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11b6b50 00:21:16.553 [2024-07-23 17:15:11.880795] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11b2800 00:21:16.553 [2024-07-23 17:15:11.880806] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11b2800 00:21:16.553 [2024-07-23 17:15:11.880908] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:16.553 BaseBdev3 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:21:16.553 17:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:16.811 17:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:17.070 [ 00:21:17.070 { 00:21:17.070 "name": "BaseBdev3", 00:21:17.070 "aliases": [ 00:21:17.070 "96544869-ea4f-4ffe-9275-612f063abc79" 00:21:17.070 ], 00:21:17.070 "product_name": "Malloc disk", 00:21:17.070 "block_size": 512, 00:21:17.070 "num_blocks": 65536, 00:21:17.070 "uuid": "96544869-ea4f-4ffe-9275-612f063abc79", 00:21:17.070 "assigned_rate_limits": { 00:21:17.070 "rw_ios_per_sec": 0, 00:21:17.070 "rw_mbytes_per_sec": 0, 00:21:17.070 "r_mbytes_per_sec": 0, 00:21:17.070 "w_mbytes_per_sec": 0 00:21:17.070 }, 00:21:17.070 "claimed": true, 00:21:17.070 "claim_type": "exclusive_write", 00:21:17.070 "zoned": false, 00:21:17.070 "supported_io_types": { 00:21:17.070 "read": true, 00:21:17.070 "write": true, 00:21:17.070 "unmap": true, 00:21:17.070 "flush": true, 00:21:17.070 "reset": true, 00:21:17.070 "nvme_admin": false, 00:21:17.070 "nvme_io": false, 00:21:17.070 "nvme_io_md": false, 00:21:17.070 "write_zeroes": true, 00:21:17.070 "zcopy": true, 00:21:17.070 "get_zone_info": false, 00:21:17.070 "zone_management": false, 00:21:17.070 "zone_append": false, 00:21:17.070 "compare": false, 00:21:17.070 "compare_and_write": false, 00:21:17.070 "abort": true, 00:21:17.070 "seek_hole": false, 00:21:17.070 "seek_data": false, 00:21:17.070 "copy": true, 00:21:17.070 "nvme_iov_md": false 00:21:17.070 }, 00:21:17.070 "memory_domains": [ 00:21:17.070 { 00:21:17.070 "dma_device_id": "system", 00:21:17.070 "dma_device_type": 1 00:21:17.070 }, 00:21:17.070 { 00:21:17.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.070 
"dma_device_type": 2 00:21:17.070 } 00:21:17.070 ], 00:21:17.070 "driver_specific": {} 00:21:17.070 } 00:21:17.070 ] 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.070 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:17.328 17:15:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.328 "name": "Existed_Raid", 00:21:17.328 "uuid": "15819ae6-91ab-4d66-b028-25a5a9355107", 00:21:17.328 "strip_size_kb": 0, 00:21:17.328 "state": "online", 00:21:17.328 "raid_level": "raid1", 00:21:17.328 "superblock": true, 00:21:17.328 "num_base_bdevs": 3, 00:21:17.328 "num_base_bdevs_discovered": 3, 00:21:17.328 "num_base_bdevs_operational": 3, 00:21:17.328 "base_bdevs_list": [ 00:21:17.328 { 00:21:17.328 "name": "BaseBdev1", 00:21:17.328 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:17.328 "is_configured": true, 00:21:17.328 "data_offset": 2048, 00:21:17.328 "data_size": 63488 00:21:17.328 }, 00:21:17.328 { 00:21:17.328 "name": "BaseBdev2", 00:21:17.328 "uuid": "af4a8296-c3fa-4461-9efd-e699fa52c8a6", 00:21:17.328 "is_configured": true, 00:21:17.328 "data_offset": 2048, 00:21:17.328 "data_size": 63488 00:21:17.328 }, 00:21:17.328 { 00:21:17.328 "name": "BaseBdev3", 00:21:17.328 "uuid": "96544869-ea4f-4ffe-9275-612f063abc79", 00:21:17.328 "is_configured": true, 00:21:17.328 "data_offset": 2048, 00:21:17.328 "data_size": 63488 00:21:17.328 } 00:21:17.328 ] 00:21:17.328 }' 00:21:17.328 17:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.328 17:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:18.265 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:18.265 [2024-07-23 17:15:13.677390] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:18.524 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:18.524 "name": "Existed_Raid", 00:21:18.524 "aliases": [ 00:21:18.524 "15819ae6-91ab-4d66-b028-25a5a9355107" 00:21:18.524 ], 00:21:18.524 "product_name": "Raid Volume", 00:21:18.524 "block_size": 512, 00:21:18.524 "num_blocks": 63488, 00:21:18.524 "uuid": "15819ae6-91ab-4d66-b028-25a5a9355107", 00:21:18.524 "assigned_rate_limits": { 00:21:18.524 "rw_ios_per_sec": 0, 00:21:18.524 "rw_mbytes_per_sec": 0, 00:21:18.524 "r_mbytes_per_sec": 0, 00:21:18.524 "w_mbytes_per_sec": 0 00:21:18.524 }, 00:21:18.524 "claimed": false, 00:21:18.524 "zoned": false, 00:21:18.524 "supported_io_types": { 00:21:18.524 "read": true, 00:21:18.524 "write": true, 00:21:18.524 "unmap": false, 00:21:18.524 "flush": false, 00:21:18.524 "reset": true, 00:21:18.524 "nvme_admin": false, 00:21:18.524 "nvme_io": false, 00:21:18.524 "nvme_io_md": false, 00:21:18.524 "write_zeroes": true, 00:21:18.524 "zcopy": false, 00:21:18.524 "get_zone_info": false, 00:21:18.524 "zone_management": false, 00:21:18.524 "zone_append": false, 00:21:18.524 "compare": false, 00:21:18.524 "compare_and_write": false, 00:21:18.524 "abort": false, 00:21:18.524 "seek_hole": false, 00:21:18.524 "seek_data": false, 00:21:18.524 "copy": false, 00:21:18.524 "nvme_iov_md": false 00:21:18.524 }, 00:21:18.524 "memory_domains": [ 00:21:18.524 { 00:21:18.524 "dma_device_id": "system", 00:21:18.524 
"dma_device_type": 1 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.524 "dma_device_type": 2 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "dma_device_id": "system", 00:21:18.524 "dma_device_type": 1 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.524 "dma_device_type": 2 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "dma_device_id": "system", 00:21:18.524 "dma_device_type": 1 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.524 "dma_device_type": 2 00:21:18.524 } 00:21:18.524 ], 00:21:18.524 "driver_specific": { 00:21:18.524 "raid": { 00:21:18.524 "uuid": "15819ae6-91ab-4d66-b028-25a5a9355107", 00:21:18.524 "strip_size_kb": 0, 00:21:18.524 "state": "online", 00:21:18.524 "raid_level": "raid1", 00:21:18.524 "superblock": true, 00:21:18.524 "num_base_bdevs": 3, 00:21:18.524 "num_base_bdevs_discovered": 3, 00:21:18.524 "num_base_bdevs_operational": 3, 00:21:18.524 "base_bdevs_list": [ 00:21:18.524 { 00:21:18.524 "name": "BaseBdev1", 00:21:18.524 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:18.524 "is_configured": true, 00:21:18.524 "data_offset": 2048, 00:21:18.524 "data_size": 63488 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "name": "BaseBdev2", 00:21:18.524 "uuid": "af4a8296-c3fa-4461-9efd-e699fa52c8a6", 00:21:18.524 "is_configured": true, 00:21:18.524 "data_offset": 2048, 00:21:18.524 "data_size": 63488 00:21:18.524 }, 00:21:18.524 { 00:21:18.524 "name": "BaseBdev3", 00:21:18.524 "uuid": "96544869-ea4f-4ffe-9275-612f063abc79", 00:21:18.524 "is_configured": true, 00:21:18.524 "data_offset": 2048, 00:21:18.524 "data_size": 63488 00:21:18.524 } 00:21:18.524 ] 00:21:18.524 } 00:21:18.524 } 00:21:18.524 }' 00:21:18.524 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:18.524 17:15:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:18.524 BaseBdev2 00:21:18.524 BaseBdev3' 00:21:18.524 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:18.524 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:18.524 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.783 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.783 "name": "BaseBdev1", 00:21:18.783 "aliases": [ 00:21:18.783 "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2" 00:21:18.783 ], 00:21:18.783 "product_name": "Malloc disk", 00:21:18.783 "block_size": 512, 00:21:18.783 "num_blocks": 65536, 00:21:18.783 "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2", 00:21:18.783 "assigned_rate_limits": { 00:21:18.783 "rw_ios_per_sec": 0, 00:21:18.783 "rw_mbytes_per_sec": 0, 00:21:18.783 "r_mbytes_per_sec": 0, 00:21:18.783 "w_mbytes_per_sec": 0 00:21:18.783 }, 00:21:18.783 "claimed": true, 00:21:18.783 "claim_type": "exclusive_write", 00:21:18.783 "zoned": false, 00:21:18.783 "supported_io_types": { 00:21:18.783 "read": true, 00:21:18.783 "write": true, 00:21:18.783 "unmap": true, 00:21:18.783 "flush": true, 00:21:18.783 "reset": true, 00:21:18.783 "nvme_admin": false, 00:21:18.783 "nvme_io": false, 00:21:18.783 "nvme_io_md": false, 00:21:18.783 "write_zeroes": true, 00:21:18.783 "zcopy": true, 00:21:18.783 "get_zone_info": false, 00:21:18.783 "zone_management": false, 00:21:18.783 "zone_append": false, 00:21:18.783 "compare": false, 00:21:18.783 "compare_and_write": false, 00:21:18.783 "abort": true, 00:21:18.783 "seek_hole": false, 00:21:18.783 "seek_data": false, 00:21:18.783 "copy": true, 00:21:18.783 "nvme_iov_md": false 00:21:18.783 }, 00:21:18.783 "memory_domains": 
[ 00:21:18.783 { 00:21:18.783 "dma_device_id": "system", 00:21:18.783 "dma_device_type": 1 00:21:18.783 }, 00:21:18.783 { 00:21:18.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.783 "dma_device_type": 2 00:21:18.783 } 00:21:18.783 ], 00:21:18.783 "driver_specific": {} 00:21:18.783 }' 00:21:18.783 17:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.783 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.783 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.783 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.783 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.783 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.783 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:19.042 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
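The loop in the trace above queries each configured base bdev with `rpc.py bdev_get_bdevs -b <name>` and uses jq to compare `.block_size` against 512 and to confirm `.md_size`, `.md_interleave`, and `.dif_type` are all `null`. A minimal Python sketch of the same property check (illustrative only, not part of the harness; the descriptor is abbreviated from the trace, and the helper name is assumed):

```python
import json

# Abbreviated base-bdev descriptor, as dumped by `rpc.py bdev_get_bdevs -b BaseBdev1`
# in the trace above (most fields omitted for brevity).
base_bdev_info = json.loads("""
{
  "name": "BaseBdev1",
  "block_size": 512,
  "num_blocks": 65536,
  "uuid": "0e8609bd-c4bc-449c-8bbd-b1f53fb212a2"
}
""")

def check_base_bdev_properties(info, expected_block_size=512):
    # Mirrors the jq checks: block_size must match, and the metadata-related
    # keys must be absent (jq prints `null` for a missing key, which the
    # harness compares with `[[ null == null ]]`).
    assert info.get("block_size") == expected_block_size
    assert info.get("md_size") is None
    assert info.get("md_interleave") is None
    assert info.get("dif_type") is None
    return True

print(check_base_bdev_properties(base_bdev_info))
```

The harness repeats this for BaseBdev1 through BaseBdev3, as seen in the surrounding trace.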
00:21:19.301 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:19.301 "name": "BaseBdev2", 00:21:19.301 "aliases": [ 00:21:19.301 "af4a8296-c3fa-4461-9efd-e699fa52c8a6" 00:21:19.301 ], 00:21:19.301 "product_name": "Malloc disk", 00:21:19.301 "block_size": 512, 00:21:19.301 "num_blocks": 65536, 00:21:19.301 "uuid": "af4a8296-c3fa-4461-9efd-e699fa52c8a6", 00:21:19.301 "assigned_rate_limits": { 00:21:19.301 "rw_ios_per_sec": 0, 00:21:19.301 "rw_mbytes_per_sec": 0, 00:21:19.301 "r_mbytes_per_sec": 0, 00:21:19.301 "w_mbytes_per_sec": 0 00:21:19.301 }, 00:21:19.301 "claimed": true, 00:21:19.301 "claim_type": "exclusive_write", 00:21:19.301 "zoned": false, 00:21:19.301 "supported_io_types": { 00:21:19.301 "read": true, 00:21:19.301 "write": true, 00:21:19.301 "unmap": true, 00:21:19.301 "flush": true, 00:21:19.301 "reset": true, 00:21:19.301 "nvme_admin": false, 00:21:19.301 "nvme_io": false, 00:21:19.301 "nvme_io_md": false, 00:21:19.301 "write_zeroes": true, 00:21:19.301 "zcopy": true, 00:21:19.301 "get_zone_info": false, 00:21:19.301 "zone_management": false, 00:21:19.301 "zone_append": false, 00:21:19.301 "compare": false, 00:21:19.301 "compare_and_write": false, 00:21:19.301 "abort": true, 00:21:19.301 "seek_hole": false, 00:21:19.301 "seek_data": false, 00:21:19.301 "copy": true, 00:21:19.301 "nvme_iov_md": false 00:21:19.301 }, 00:21:19.301 "memory_domains": [ 00:21:19.301 { 00:21:19.301 "dma_device_id": "system", 00:21:19.301 "dma_device_type": 1 00:21:19.301 }, 00:21:19.301 { 00:21:19.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.301 "dma_device_type": 2 00:21:19.301 } 00:21:19.301 ], 00:21:19.301 "driver_specific": {} 00:21:19.301 }' 00:21:19.301 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.301 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.301 17:15:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:19.301 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.559 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.819 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:19.819 17:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:19.819 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:19.819 "name": "BaseBdev3", 00:21:19.819 "aliases": [ 00:21:19.819 "96544869-ea4f-4ffe-9275-612f063abc79" 00:21:19.819 ], 00:21:19.819 "product_name": "Malloc disk", 00:21:19.819 "block_size": 512, 00:21:19.819 "num_blocks": 65536, 00:21:19.819 "uuid": "96544869-ea4f-4ffe-9275-612f063abc79", 00:21:19.819 "assigned_rate_limits": { 00:21:19.819 "rw_ios_per_sec": 0, 00:21:19.819 "rw_mbytes_per_sec": 0, 00:21:19.819 "r_mbytes_per_sec": 0, 00:21:19.819 
"w_mbytes_per_sec": 0 00:21:19.819 }, 00:21:19.819 "claimed": true, 00:21:19.819 "claim_type": "exclusive_write", 00:21:19.819 "zoned": false, 00:21:19.819 "supported_io_types": { 00:21:19.819 "read": true, 00:21:19.819 "write": true, 00:21:19.819 "unmap": true, 00:21:19.819 "flush": true, 00:21:19.819 "reset": true, 00:21:19.819 "nvme_admin": false, 00:21:19.819 "nvme_io": false, 00:21:19.819 "nvme_io_md": false, 00:21:19.819 "write_zeroes": true, 00:21:19.819 "zcopy": true, 00:21:19.819 "get_zone_info": false, 00:21:19.819 "zone_management": false, 00:21:19.819 "zone_append": false, 00:21:19.819 "compare": false, 00:21:19.819 "compare_and_write": false, 00:21:19.819 "abort": true, 00:21:19.819 "seek_hole": false, 00:21:19.819 "seek_data": false, 00:21:19.819 "copy": true, 00:21:19.819 "nvme_iov_md": false 00:21:19.819 }, 00:21:19.819 "memory_domains": [ 00:21:19.819 { 00:21:19.819 "dma_device_id": "system", 00:21:19.819 "dma_device_type": 1 00:21:19.819 }, 00:21:19.819 { 00:21:19.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.819 "dma_device_type": 2 00:21:19.819 } 00:21:19.819 ], 00:21:19.819 "driver_specific": {} 00:21:19.819 }' 00:21:19.819 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:20.078 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.337 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.337 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:20.337 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:20.596 [2024-07-23 17:15:15.798806] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:21:20.596 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.597 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.597 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.597 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.597 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.597 17:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.856 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.856 "name": "Existed_Raid", 00:21:20.856 "uuid": "15819ae6-91ab-4d66-b028-25a5a9355107", 00:21:20.856 "strip_size_kb": 0, 00:21:20.856 "state": "online", 00:21:20.856 "raid_level": "raid1", 00:21:20.856 "superblock": true, 00:21:20.856 "num_base_bdevs": 3, 00:21:20.856 "num_base_bdevs_discovered": 2, 00:21:20.856 "num_base_bdevs_operational": 2, 00:21:20.856 "base_bdevs_list": [ 00:21:20.856 { 00:21:20.856 "name": null, 00:21:20.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.856 "is_configured": false, 00:21:20.856 "data_offset": 2048, 00:21:20.856 "data_size": 63488 00:21:20.856 }, 00:21:20.856 { 00:21:20.856 "name": "BaseBdev2", 00:21:20.856 "uuid": "af4a8296-c3fa-4461-9efd-e699fa52c8a6", 00:21:20.856 "is_configured": true, 00:21:20.856 "data_offset": 2048, 00:21:20.856 "data_size": 63488 00:21:20.856 }, 00:21:20.856 { 00:21:20.856 "name": "BaseBdev3", 00:21:20.856 "uuid": "96544869-ea4f-4ffe-9275-612f063abc79", 00:21:20.856 "is_configured": true, 00:21:20.856 "data_offset": 2048, 00:21:20.856 "data_size": 63488 00:21:20.856 } 
00:21:20.856 ] 00:21:20.856 }' 00:21:20.856 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.856 17:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:21.423 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:21.423 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:21.423 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.423 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:21.682 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:21.682 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:21.682 17:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:21.941 [2024-07-23 17:15:17.116259] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:21.941 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:21.941 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:21.941 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.941 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:22.200 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:22.200 17:15:17 
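After `bdev_malloc_delete BaseBdev1`, the harness calls `verify_raid_bdev_state Existed_Raid online raid1 0 2` and compares the dumped raid JSON field by field, as the trace above shows (discovered and operational both drop to 2 while the raid1 volume stays online). A Python sketch of that comparison (the field names come from the trace; the helper itself is an illustration, not SPDK code):

```python
import json

# Degraded raid1 info as shown in the trace after BaseBdev1 was deleted
# (abbreviated; base_bdevs_list omitted).
raid_bdev_info = json.loads("""
{
  "name": "Existed_Raid",
  "state": "online",
  "raid_level": "raid1",
  "strip_size_kb": 0,
  "num_base_bdevs": 3,
  "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 2
}
""")

def verify_raid_bdev_state(info, expected_state, raid_level, strip_size, operational):
    # Mirrors the shell helper's checks: state, level, strip size, and the
    # expected number of operational base bdevs after the removal.
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_operational"] == operational
    return True

print(verify_raid_bdev_state(raid_bdev_info, "online", "raid1", 0, 2))
```

Because raid1 has redundancy (`has_redundancy raid1` returns 0 in the trace), the expected state after losing one of three base bdevs is `online` rather than `offline`.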
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:22.200 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:22.767 [2024-07-23 17:15:17.901107] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:22.767 [2024-07-23 17:15:17.901199] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:22.767 [2024-07-23 17:15:17.913875] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:22.767 [2024-07-23 17:15:17.913935] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:22.767 [2024-07-23 17:15:17.913947] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b2800 name Existed_Raid, state offline 00:21:22.767 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:22.767 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:22.767 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.767 17:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:23.026 17:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:23.026 17:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:23.026 17:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:21:23.026 17:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:23.026 17:15:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:23.026 17:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:23.285 BaseBdev2 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:23.543 17:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:23.802 [ 00:21:23.802 { 00:21:23.802 "name": "BaseBdev2", 00:21:23.802 "aliases": [ 00:21:23.802 "6af1ebb7-764a-4923-82e4-64df4419c2b5" 00:21:23.802 ], 00:21:23.802 "product_name": "Malloc disk", 00:21:23.802 "block_size": 512, 00:21:23.802 "num_blocks": 65536, 00:21:23.802 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:23.802 "assigned_rate_limits": { 00:21:23.802 "rw_ios_per_sec": 0, 00:21:23.802 "rw_mbytes_per_sec": 0, 00:21:23.802 "r_mbytes_per_sec": 0, 00:21:23.802 "w_mbytes_per_sec": 0 00:21:23.802 }, 00:21:23.802 "claimed": false, 00:21:23.802 "zoned": false, 
00:21:23.802 "supported_io_types": { 00:21:23.802 "read": true, 00:21:23.802 "write": true, 00:21:23.802 "unmap": true, 00:21:23.802 "flush": true, 00:21:23.802 "reset": true, 00:21:23.802 "nvme_admin": false, 00:21:23.802 "nvme_io": false, 00:21:23.802 "nvme_io_md": false, 00:21:23.802 "write_zeroes": true, 00:21:23.802 "zcopy": true, 00:21:23.802 "get_zone_info": false, 00:21:23.802 "zone_management": false, 00:21:23.802 "zone_append": false, 00:21:23.802 "compare": false, 00:21:23.802 "compare_and_write": false, 00:21:23.802 "abort": true, 00:21:23.802 "seek_hole": false, 00:21:23.802 "seek_data": false, 00:21:23.802 "copy": true, 00:21:23.802 "nvme_iov_md": false 00:21:23.802 }, 00:21:23.802 "memory_domains": [ 00:21:23.802 { 00:21:23.802 "dma_device_id": "system", 00:21:23.802 "dma_device_type": 1 00:21:23.802 }, 00:21:23.802 { 00:21:23.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.802 "dma_device_type": 2 00:21:23.802 } 00:21:23.802 ], 00:21:23.802 "driver_specific": {} 00:21:23.802 } 00:21:23.802 ] 00:21:23.802 17:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:23.802 17:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:23.802 17:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:23.802 17:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:24.371 BaseBdev3 00:21:24.371 17:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:24.371 17:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:24.371 17:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:24.371 17:15:19 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:24.371 17:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:24.371 17:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:24.371 17:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:24.939 17:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:25.198 [ 00:21:25.198 { 00:21:25.198 "name": "BaseBdev3", 00:21:25.198 "aliases": [ 00:21:25.198 "d10c046c-1df8-4c43-a546-2de627b1c39b" 00:21:25.198 ], 00:21:25.198 "product_name": "Malloc disk", 00:21:25.198 "block_size": 512, 00:21:25.198 "num_blocks": 65536, 00:21:25.198 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:25.198 "assigned_rate_limits": { 00:21:25.198 "rw_ios_per_sec": 0, 00:21:25.198 "rw_mbytes_per_sec": 0, 00:21:25.198 "r_mbytes_per_sec": 0, 00:21:25.198 "w_mbytes_per_sec": 0 00:21:25.198 }, 00:21:25.198 "claimed": false, 00:21:25.198 "zoned": false, 00:21:25.198 "supported_io_types": { 00:21:25.198 "read": true, 00:21:25.198 "write": true, 00:21:25.198 "unmap": true, 00:21:25.198 "flush": true, 00:21:25.198 "reset": true, 00:21:25.198 "nvme_admin": false, 00:21:25.198 "nvme_io": false, 00:21:25.198 "nvme_io_md": false, 00:21:25.198 "write_zeroes": true, 00:21:25.198 "zcopy": true, 00:21:25.198 "get_zone_info": false, 00:21:25.198 "zone_management": false, 00:21:25.198 "zone_append": false, 00:21:25.198 "compare": false, 00:21:25.198 "compare_and_write": false, 00:21:25.198 "abort": true, 00:21:25.198 "seek_hole": false, 00:21:25.198 "seek_data": false, 00:21:25.198 "copy": true, 00:21:25.198 "nvme_iov_md": 
false 00:21:25.198 }, 00:21:25.198 "memory_domains": [ 00:21:25.198 { 00:21:25.198 "dma_device_id": "system", 00:21:25.198 "dma_device_type": 1 00:21:25.198 }, 00:21:25.198 { 00:21:25.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:25.198 "dma_device_type": 2 00:21:25.198 } 00:21:25.198 ], 00:21:25.198 "driver_specific": {} 00:21:25.198 } 00:21:25.198 ] 00:21:25.198 17:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:25.198 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:25.198 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:25.198 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:21:25.766 [2024-07-23 17:15:20.943615] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:25.766 [2024-07-23 17:15:20.943659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:25.766 [2024-07-23 17:15:20.943677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:25.766 [2024-07-23 17:15:20.945004] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.766 17:15:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.766 17:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:26.025 17:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.025 "name": "Existed_Raid", 00:21:26.025 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:26.025 "strip_size_kb": 0, 00:21:26.025 "state": "configuring", 00:21:26.025 "raid_level": "raid1", 00:21:26.025 "superblock": true, 00:21:26.025 "num_base_bdevs": 3, 00:21:26.025 "num_base_bdevs_discovered": 2, 00:21:26.025 "num_base_bdevs_operational": 3, 00:21:26.025 "base_bdevs_list": [ 00:21:26.025 { 00:21:26.025 "name": "BaseBdev1", 00:21:26.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.025 "is_configured": false, 00:21:26.025 "data_offset": 0, 00:21:26.025 "data_size": 0 00:21:26.025 }, 00:21:26.025 { 00:21:26.025 "name": "BaseBdev2", 00:21:26.025 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:26.025 "is_configured": true, 00:21:26.025 "data_offset": 2048, 00:21:26.025 "data_size": 63488 00:21:26.025 }, 00:21:26.025 { 00:21:26.025 "name": "BaseBdev3", 
00:21:26.025 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:26.025 "is_configured": true, 00:21:26.025 "data_offset": 2048, 00:21:26.025 "data_size": 63488 00:21:26.025 } 00:21:26.025 ] 00:21:26.025 }' 00:21:26.025 17:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.025 17:15:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:26.592 17:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:26.592 [2024-07-23 17:15:21.994356] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.851 "name": "Existed_Raid", 00:21:26.851 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:26.851 "strip_size_kb": 0, 00:21:26.851 "state": "configuring", 00:21:26.851 "raid_level": "raid1", 00:21:26.851 "superblock": true, 00:21:26.851 "num_base_bdevs": 3, 00:21:26.851 "num_base_bdevs_discovered": 1, 00:21:26.851 "num_base_bdevs_operational": 3, 00:21:26.851 "base_bdevs_list": [ 00:21:26.851 { 00:21:26.851 "name": "BaseBdev1", 00:21:26.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.851 "is_configured": false, 00:21:26.851 "data_offset": 0, 00:21:26.851 "data_size": 0 00:21:26.851 }, 00:21:26.851 { 00:21:26.851 "name": null, 00:21:26.851 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:26.851 "is_configured": false, 00:21:26.851 "data_offset": 2048, 00:21:26.851 "data_size": 63488 00:21:26.851 }, 00:21:26.851 { 00:21:26.851 "name": "BaseBdev3", 00:21:26.851 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:26.851 "is_configured": true, 00:21:26.851 "data_offset": 2048, 00:21:26.851 "data_size": 63488 00:21:26.851 } 00:21:26.851 ] 00:21:26.851 }' 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.851 17:15:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:27.787 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.787 17:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
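The `jq '.[0].base_bdevs_list[1].is_configured'` probe above verifies that the slot freed by `bdev_raid_remove_base_bdev BaseBdev2` now reads `false`. An equivalent standalone check (illustrative; the list is abbreviated from the `bdev_raid_get_bdevs all` output shown in the trace):

```python
import json

# Abbreviated raid bdev listing after `bdev_raid_remove_base_bdev BaseBdev2`:
# slot 0 (BaseBdev1) is not yet created, slot 1 (removed BaseBdev2) has a
# null name, slot 2 (BaseBdev3) is still configured.
raid_bdevs = json.loads("""
[
  {
    "name": "Existed_Raid",
    "base_bdevs_list": [
      {"name": "BaseBdev1", "is_configured": false},
      {"name": null,        "is_configured": false},
      {"name": "BaseBdev3", "is_configured": true}
    ]
  }
]
""")

# The same selection the jq filter performs on index 1 of the list.
slot = raid_bdevs[0]["base_bdevs_list"][1]
assert slot["is_configured"] is False and slot["name"] is None
print(slot["is_configured"])
```

This matches the `[[ false == \f\a\l\s\e ]]` comparison in the trace that follows the probe.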
'.[0].base_bdevs_list[1].is_configured' 00:21:27.787 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:27.787 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:28.045 [2024-07-23 17:15:23.402724] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:28.045 BaseBdev1 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:28.045 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:28.304 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:28.563 [ 00:21:28.563 { 00:21:28.563 "name": "BaseBdev1", 00:21:28.563 "aliases": [ 00:21:28.563 "c9756bba-b377-4edb-81a8-8e3d1780d047" 00:21:28.563 ], 00:21:28.563 "product_name": "Malloc disk", 00:21:28.563 "block_size": 512, 00:21:28.563 "num_blocks": 65536, 00:21:28.563 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:28.563 
"assigned_rate_limits": { 00:21:28.563 "rw_ios_per_sec": 0, 00:21:28.563 "rw_mbytes_per_sec": 0, 00:21:28.563 "r_mbytes_per_sec": 0, 00:21:28.563 "w_mbytes_per_sec": 0 00:21:28.563 }, 00:21:28.563 "claimed": true, 00:21:28.563 "claim_type": "exclusive_write", 00:21:28.563 "zoned": false, 00:21:28.563 "supported_io_types": { 00:21:28.563 "read": true, 00:21:28.563 "write": true, 00:21:28.563 "unmap": true, 00:21:28.563 "flush": true, 00:21:28.563 "reset": true, 00:21:28.563 "nvme_admin": false, 00:21:28.563 "nvme_io": false, 00:21:28.563 "nvme_io_md": false, 00:21:28.563 "write_zeroes": true, 00:21:28.563 "zcopy": true, 00:21:28.563 "get_zone_info": false, 00:21:28.563 "zone_management": false, 00:21:28.563 "zone_append": false, 00:21:28.563 "compare": false, 00:21:28.563 "compare_and_write": false, 00:21:28.563 "abort": true, 00:21:28.563 "seek_hole": false, 00:21:28.563 "seek_data": false, 00:21:28.563 "copy": true, 00:21:28.563 "nvme_iov_md": false 00:21:28.563 }, 00:21:28.563 "memory_domains": [ 00:21:28.563 { 00:21:28.563 "dma_device_id": "system", 00:21:28.563 "dma_device_type": 1 00:21:28.563 }, 00:21:28.563 { 00:21:28.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:28.563 "dma_device_type": 2 00:21:28.563 } 00:21:28.563 ], 00:21:28.563 "driver_specific": {} 00:21:28.563 } 00:21:28.563 ] 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.563 17:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:28.822 17:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.822 "name": "Existed_Raid", 00:21:28.822 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:28.822 "strip_size_kb": 0, 00:21:28.822 "state": "configuring", 00:21:28.822 "raid_level": "raid1", 00:21:28.822 "superblock": true, 00:21:28.822 "num_base_bdevs": 3, 00:21:28.822 "num_base_bdevs_discovered": 2, 00:21:28.822 "num_base_bdevs_operational": 3, 00:21:28.822 "base_bdevs_list": [ 00:21:28.822 { 00:21:28.822 "name": "BaseBdev1", 00:21:28.822 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:28.822 "is_configured": true, 00:21:28.822 "data_offset": 2048, 00:21:28.822 "data_size": 63488 00:21:28.822 }, 00:21:28.822 { 00:21:28.822 "name": null, 00:21:28.822 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:28.822 "is_configured": false, 00:21:28.822 "data_offset": 2048, 00:21:28.822 "data_size": 63488 00:21:28.822 }, 00:21:28.822 { 00:21:28.822 "name": "BaseBdev3", 00:21:28.822 "uuid": 
"d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:28.822 "is_configured": true, 00:21:28.822 "data_offset": 2048, 00:21:28.822 "data_size": 63488 00:21:28.822 } 00:21:28.822 ] 00:21:28.822 }' 00:21:28.822 17:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.822 17:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:29.389 17:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.389 17:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:29.647 17:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:29.647 17:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:29.906 [2024-07-23 17:15:25.175440] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.906 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.165 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.165 "name": "Existed_Raid", 00:21:30.165 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:30.165 "strip_size_kb": 0, 00:21:30.165 "state": "configuring", 00:21:30.165 "raid_level": "raid1", 00:21:30.165 "superblock": true, 00:21:30.165 "num_base_bdevs": 3, 00:21:30.165 "num_base_bdevs_discovered": 1, 00:21:30.165 "num_base_bdevs_operational": 3, 00:21:30.165 "base_bdevs_list": [ 00:21:30.165 { 00:21:30.165 "name": "BaseBdev1", 00:21:30.165 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:30.165 "is_configured": true, 00:21:30.165 "data_offset": 2048, 00:21:30.165 "data_size": 63488 00:21:30.165 }, 00:21:30.165 { 00:21:30.165 "name": null, 00:21:30.165 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:30.165 "is_configured": false, 00:21:30.165 "data_offset": 2048, 00:21:30.165 "data_size": 63488 00:21:30.165 }, 00:21:30.165 { 00:21:30.165 "name": null, 00:21:30.165 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:30.165 "is_configured": false, 00:21:30.165 "data_offset": 2048, 00:21:30.165 "data_size": 63488 00:21:30.165 } 00:21:30.165 ] 00:21:30.165 }' 00:21:30.165 17:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:21:30.165 17:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:30.787 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.787 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:31.355 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:31.355 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:31.615 [2024-07-23 17:15:26.803792] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.615 17:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:31.874 17:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.874 "name": "Existed_Raid", 00:21:31.874 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:31.874 "strip_size_kb": 0, 00:21:31.874 "state": "configuring", 00:21:31.874 "raid_level": "raid1", 00:21:31.874 "superblock": true, 00:21:31.874 "num_base_bdevs": 3, 00:21:31.874 "num_base_bdevs_discovered": 2, 00:21:31.874 "num_base_bdevs_operational": 3, 00:21:31.874 "base_bdevs_list": [ 00:21:31.874 { 00:21:31.874 "name": "BaseBdev1", 00:21:31.874 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:31.874 "is_configured": true, 00:21:31.874 "data_offset": 2048, 00:21:31.874 "data_size": 63488 00:21:31.874 }, 00:21:31.874 { 00:21:31.874 "name": null, 00:21:31.874 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:31.874 "is_configured": false, 00:21:31.874 "data_offset": 2048, 00:21:31.874 "data_size": 63488 00:21:31.874 }, 00:21:31.874 { 00:21:31.874 "name": "BaseBdev3", 00:21:31.874 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:31.874 "is_configured": true, 00:21:31.874 "data_offset": 2048, 00:21:31.875 "data_size": 63488 00:21:31.875 } 00:21:31.875 ] 00:21:31.875 }' 00:21:31.875 17:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.875 17:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:32.442 17:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.442 17:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:32.700 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:32.700 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:32.959 [2024-07-23 17:15:28.243610] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.959 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.217 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.218 "name": "Existed_Raid", 00:21:33.218 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:33.218 "strip_size_kb": 0, 00:21:33.218 "state": "configuring", 00:21:33.218 "raid_level": "raid1", 00:21:33.218 "superblock": true, 00:21:33.218 "num_base_bdevs": 3, 00:21:33.218 "num_base_bdevs_discovered": 1, 00:21:33.218 "num_base_bdevs_operational": 3, 00:21:33.218 "base_bdevs_list": [ 00:21:33.218 { 00:21:33.218 "name": null, 00:21:33.218 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:33.218 "is_configured": false, 00:21:33.218 "data_offset": 2048, 00:21:33.218 "data_size": 63488 00:21:33.218 }, 00:21:33.218 { 00:21:33.218 "name": null, 00:21:33.218 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:33.218 "is_configured": false, 00:21:33.218 "data_offset": 2048, 00:21:33.218 "data_size": 63488 00:21:33.218 }, 00:21:33.218 { 00:21:33.218 "name": "BaseBdev3", 00:21:33.218 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:33.218 "is_configured": true, 00:21:33.218 "data_offset": 2048, 00:21:33.218 "data_size": 63488 00:21:33.218 } 00:21:33.218 ] 00:21:33.218 }' 00:21:33.218 17:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.218 17:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:34.154 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.154 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:21:34.154 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:34.154 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:34.414 [2024-07-23 17:15:29.690123] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.414 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:21:34.673 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:34.673 "name": "Existed_Raid", 00:21:34.673 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:34.673 "strip_size_kb": 0, 00:21:34.673 "state": "configuring", 00:21:34.673 "raid_level": "raid1", 00:21:34.673 "superblock": true, 00:21:34.673 "num_base_bdevs": 3, 00:21:34.673 "num_base_bdevs_discovered": 2, 00:21:34.673 "num_base_bdevs_operational": 3, 00:21:34.673 "base_bdevs_list": [ 00:21:34.673 { 00:21:34.673 "name": null, 00:21:34.673 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:34.673 "is_configured": false, 00:21:34.673 "data_offset": 2048, 00:21:34.673 "data_size": 63488 00:21:34.673 }, 00:21:34.673 { 00:21:34.673 "name": "BaseBdev2", 00:21:34.673 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:34.673 "is_configured": true, 00:21:34.673 "data_offset": 2048, 00:21:34.673 "data_size": 63488 00:21:34.673 }, 00:21:34.673 { 00:21:34.673 "name": "BaseBdev3", 00:21:34.673 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:34.673 "is_configured": true, 00:21:34.673 "data_offset": 2048, 00:21:34.673 "data_size": 63488 00:21:34.673 } 00:21:34.673 ] 00:21:34.673 }' 00:21:34.673 17:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.673 17:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:35.239 17:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.239 17:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:35.498 17:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:35.498 17:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.498 17:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:35.757 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c9756bba-b377-4edb-81a8-8e3d1780d047 00:21:36.015 [2024-07-23 17:15:31.282957] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:36.015 [2024-07-23 17:15:31.283114] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11b4b60 00:21:36.015 [2024-07-23 17:15:31.283127] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:36.015 [2024-07-23 17:15:31.283298] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11b5440 00:21:36.015 [2024-07-23 17:15:31.283416] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11b4b60 00:21:36.015 [2024-07-23 17:15:31.283426] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11b4b60 00:21:36.015 [2024-07-23 17:15:31.283520] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:36.015 NewBaseBdev 00:21:36.015 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:36.015 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:36.015 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:36.015 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:36.015 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:36.015 
17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:36.016 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:36.274 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:36.533 [ 00:21:36.533 { 00:21:36.533 "name": "NewBaseBdev", 00:21:36.533 "aliases": [ 00:21:36.533 "c9756bba-b377-4edb-81a8-8e3d1780d047" 00:21:36.533 ], 00:21:36.533 "product_name": "Malloc disk", 00:21:36.533 "block_size": 512, 00:21:36.533 "num_blocks": 65536, 00:21:36.533 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:36.533 "assigned_rate_limits": { 00:21:36.533 "rw_ios_per_sec": 0, 00:21:36.533 "rw_mbytes_per_sec": 0, 00:21:36.533 "r_mbytes_per_sec": 0, 00:21:36.533 "w_mbytes_per_sec": 0 00:21:36.533 }, 00:21:36.533 "claimed": true, 00:21:36.533 "claim_type": "exclusive_write", 00:21:36.533 "zoned": false, 00:21:36.533 "supported_io_types": { 00:21:36.533 "read": true, 00:21:36.533 "write": true, 00:21:36.533 "unmap": true, 00:21:36.533 "flush": true, 00:21:36.533 "reset": true, 00:21:36.533 "nvme_admin": false, 00:21:36.533 "nvme_io": false, 00:21:36.533 "nvme_io_md": false, 00:21:36.533 "write_zeroes": true, 00:21:36.533 "zcopy": true, 00:21:36.533 "get_zone_info": false, 00:21:36.533 "zone_management": false, 00:21:36.533 "zone_append": false, 00:21:36.533 "compare": false, 00:21:36.533 "compare_and_write": false, 00:21:36.533 "abort": true, 00:21:36.533 "seek_hole": false, 00:21:36.533 "seek_data": false, 00:21:36.533 "copy": true, 00:21:36.533 "nvme_iov_md": false 00:21:36.533 }, 00:21:36.533 "memory_domains": [ 00:21:36.533 { 00:21:36.533 "dma_device_id": "system", 00:21:36.533 "dma_device_type": 1 00:21:36.533 
}, 00:21:36.533 { 00:21:36.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.533 "dma_device_type": 2 00:21:36.533 } 00:21:36.533 ], 00:21:36.533 "driver_specific": {} 00:21:36.533 } 00:21:36.533 ] 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.533 17:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.792 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.792 "name": "Existed_Raid", 00:21:36.792 "uuid": 
"ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:36.792 "strip_size_kb": 0, 00:21:36.792 "state": "online", 00:21:36.792 "raid_level": "raid1", 00:21:36.792 "superblock": true, 00:21:36.792 "num_base_bdevs": 3, 00:21:36.792 "num_base_bdevs_discovered": 3, 00:21:36.792 "num_base_bdevs_operational": 3, 00:21:36.792 "base_bdevs_list": [ 00:21:36.792 { 00:21:36.792 "name": "NewBaseBdev", 00:21:36.792 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:36.792 "is_configured": true, 00:21:36.792 "data_offset": 2048, 00:21:36.792 "data_size": 63488 00:21:36.792 }, 00:21:36.792 { 00:21:36.792 "name": "BaseBdev2", 00:21:36.792 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:36.792 "is_configured": true, 00:21:36.792 "data_offset": 2048, 00:21:36.792 "data_size": 63488 00:21:36.792 }, 00:21:36.792 { 00:21:36.792 "name": "BaseBdev3", 00:21:36.792 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:36.792 "is_configured": true, 00:21:36.792 "data_offset": 2048, 00:21:36.792 "data_size": 63488 00:21:36.792 } 00:21:36.792 ] 00:21:36.792 }' 00:21:36.792 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.792 17:15:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:37.360 17:15:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:37.360 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:37.618 [2024-07-23 17:15:32.955725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:37.618 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:37.618 "name": "Existed_Raid", 00:21:37.618 "aliases": [ 00:21:37.618 "ca528785-6547-4a3b-ae8e-c97d1d63095f" 00:21:37.618 ], 00:21:37.618 "product_name": "Raid Volume", 00:21:37.618 "block_size": 512, 00:21:37.618 "num_blocks": 63488, 00:21:37.618 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:37.618 "assigned_rate_limits": { 00:21:37.618 "rw_ios_per_sec": 0, 00:21:37.618 "rw_mbytes_per_sec": 0, 00:21:37.618 "r_mbytes_per_sec": 0, 00:21:37.618 "w_mbytes_per_sec": 0 00:21:37.618 }, 00:21:37.618 "claimed": false, 00:21:37.618 "zoned": false, 00:21:37.618 "supported_io_types": { 00:21:37.618 "read": true, 00:21:37.618 "write": true, 00:21:37.618 "unmap": false, 00:21:37.618 "flush": false, 00:21:37.618 "reset": true, 00:21:37.618 "nvme_admin": false, 00:21:37.618 "nvme_io": false, 00:21:37.618 "nvme_io_md": false, 00:21:37.618 "write_zeroes": true, 00:21:37.618 "zcopy": false, 00:21:37.618 "get_zone_info": false, 00:21:37.618 "zone_management": false, 00:21:37.618 "zone_append": false, 00:21:37.618 "compare": false, 00:21:37.618 "compare_and_write": false, 00:21:37.618 "abort": false, 00:21:37.618 "seek_hole": false, 00:21:37.618 "seek_data": false, 00:21:37.618 "copy": false, 00:21:37.618 "nvme_iov_md": false 00:21:37.618 }, 00:21:37.618 "memory_domains": [ 00:21:37.618 { 00:21:37.618 "dma_device_id": "system", 00:21:37.618 "dma_device_type": 1 00:21:37.618 }, 00:21:37.618 { 00:21:37.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.618 
"dma_device_type": 2 00:21:37.618 }, 00:21:37.618 { 00:21:37.618 "dma_device_id": "system", 00:21:37.618 "dma_device_type": 1 00:21:37.618 }, 00:21:37.618 { 00:21:37.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.618 "dma_device_type": 2 00:21:37.618 }, 00:21:37.618 { 00:21:37.618 "dma_device_id": "system", 00:21:37.618 "dma_device_type": 1 00:21:37.618 }, 00:21:37.618 { 00:21:37.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.618 "dma_device_type": 2 00:21:37.618 } 00:21:37.618 ], 00:21:37.618 "driver_specific": { 00:21:37.618 "raid": { 00:21:37.618 "uuid": "ca528785-6547-4a3b-ae8e-c97d1d63095f", 00:21:37.618 "strip_size_kb": 0, 00:21:37.618 "state": "online", 00:21:37.619 "raid_level": "raid1", 00:21:37.619 "superblock": true, 00:21:37.619 "num_base_bdevs": 3, 00:21:37.619 "num_base_bdevs_discovered": 3, 00:21:37.619 "num_base_bdevs_operational": 3, 00:21:37.619 "base_bdevs_list": [ 00:21:37.619 { 00:21:37.619 "name": "NewBaseBdev", 00:21:37.619 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:37.619 "is_configured": true, 00:21:37.619 "data_offset": 2048, 00:21:37.619 "data_size": 63488 00:21:37.619 }, 00:21:37.619 { 00:21:37.619 "name": "BaseBdev2", 00:21:37.619 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:37.619 "is_configured": true, 00:21:37.619 "data_offset": 2048, 00:21:37.619 "data_size": 63488 00:21:37.619 }, 00:21:37.619 { 00:21:37.619 "name": "BaseBdev3", 00:21:37.619 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:37.619 "is_configured": true, 00:21:37.619 "data_offset": 2048, 00:21:37.619 "data_size": 63488 00:21:37.619 } 00:21:37.619 ] 00:21:37.619 } 00:21:37.619 } 00:21:37.619 }' 00:21:37.619 17:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:37.619 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:37.619 BaseBdev2 00:21:37.619 
BaseBdev3' 00:21:37.619 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:37.619 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:37.619 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:37.877 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:37.877 "name": "NewBaseBdev", 00:21:37.877 "aliases": [ 00:21:37.877 "c9756bba-b377-4edb-81a8-8e3d1780d047" 00:21:37.877 ], 00:21:37.877 "product_name": "Malloc disk", 00:21:37.877 "block_size": 512, 00:21:37.877 "num_blocks": 65536, 00:21:37.877 "uuid": "c9756bba-b377-4edb-81a8-8e3d1780d047", 00:21:37.877 "assigned_rate_limits": { 00:21:37.877 "rw_ios_per_sec": 0, 00:21:37.878 "rw_mbytes_per_sec": 0, 00:21:37.878 "r_mbytes_per_sec": 0, 00:21:37.878 "w_mbytes_per_sec": 0 00:21:37.878 }, 00:21:37.878 "claimed": true, 00:21:37.878 "claim_type": "exclusive_write", 00:21:37.878 "zoned": false, 00:21:37.878 "supported_io_types": { 00:21:37.878 "read": true, 00:21:37.878 "write": true, 00:21:37.878 "unmap": true, 00:21:37.878 "flush": true, 00:21:37.878 "reset": true, 00:21:37.878 "nvme_admin": false, 00:21:37.878 "nvme_io": false, 00:21:37.878 "nvme_io_md": false, 00:21:37.878 "write_zeroes": true, 00:21:37.878 "zcopy": true, 00:21:37.878 "get_zone_info": false, 00:21:37.878 "zone_management": false, 00:21:37.878 "zone_append": false, 00:21:37.878 "compare": false, 00:21:37.878 "compare_and_write": false, 00:21:37.878 "abort": true, 00:21:37.878 "seek_hole": false, 00:21:37.878 "seek_data": false, 00:21:37.878 "copy": true, 00:21:37.878 "nvme_iov_md": false 00:21:37.878 }, 00:21:37.878 "memory_domains": [ 00:21:37.878 { 00:21:37.878 "dma_device_id": "system", 00:21:37.878 "dma_device_type": 1 00:21:37.878 }, 00:21:37.878 { 
00:21:37.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.878 "dma_device_type": 2 00:21:37.878 } 00:21:37.878 ], 00:21:37.878 "driver_specific": {} 00:21:37.878 }' 00:21:37.878 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:38.138 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.397 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.397 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:38.397 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:38.397 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:38.398 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:38.656 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:38.656 "name": 
"BaseBdev2", 00:21:38.656 "aliases": [ 00:21:38.656 "6af1ebb7-764a-4923-82e4-64df4419c2b5" 00:21:38.656 ], 00:21:38.656 "product_name": "Malloc disk", 00:21:38.656 "block_size": 512, 00:21:38.656 "num_blocks": 65536, 00:21:38.656 "uuid": "6af1ebb7-764a-4923-82e4-64df4419c2b5", 00:21:38.656 "assigned_rate_limits": { 00:21:38.656 "rw_ios_per_sec": 0, 00:21:38.656 "rw_mbytes_per_sec": 0, 00:21:38.656 "r_mbytes_per_sec": 0, 00:21:38.656 "w_mbytes_per_sec": 0 00:21:38.656 }, 00:21:38.656 "claimed": true, 00:21:38.656 "claim_type": "exclusive_write", 00:21:38.656 "zoned": false, 00:21:38.656 "supported_io_types": { 00:21:38.656 "read": true, 00:21:38.656 "write": true, 00:21:38.656 "unmap": true, 00:21:38.656 "flush": true, 00:21:38.656 "reset": true, 00:21:38.656 "nvme_admin": false, 00:21:38.656 "nvme_io": false, 00:21:38.656 "nvme_io_md": false, 00:21:38.656 "write_zeroes": true, 00:21:38.656 "zcopy": true, 00:21:38.656 "get_zone_info": false, 00:21:38.656 "zone_management": false, 00:21:38.656 "zone_append": false, 00:21:38.656 "compare": false, 00:21:38.656 "compare_and_write": false, 00:21:38.656 "abort": true, 00:21:38.656 "seek_hole": false, 00:21:38.656 "seek_data": false, 00:21:38.656 "copy": true, 00:21:38.656 "nvme_iov_md": false 00:21:38.656 }, 00:21:38.656 "memory_domains": [ 00:21:38.656 { 00:21:38.656 "dma_device_id": "system", 00:21:38.656 "dma_device_type": 1 00:21:38.656 }, 00:21:38.656 { 00:21:38.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.656 "dma_device_type": 2 00:21:38.656 } 00:21:38.656 ], 00:21:38.656 "driver_specific": {} 00:21:38.656 }' 00:21:38.656 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.656 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.656 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:38.656 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:21:38.656 17:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.656 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:38.656 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.656 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:38.915 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:39.173 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:39.173 "name": "BaseBdev3", 00:21:39.173 "aliases": [ 00:21:39.173 "d10c046c-1df8-4c43-a546-2de627b1c39b" 00:21:39.173 ], 00:21:39.173 "product_name": "Malloc disk", 00:21:39.173 "block_size": 512, 00:21:39.173 "num_blocks": 65536, 00:21:39.173 "uuid": "d10c046c-1df8-4c43-a546-2de627b1c39b", 00:21:39.173 "assigned_rate_limits": { 00:21:39.173 "rw_ios_per_sec": 0, 00:21:39.173 "rw_mbytes_per_sec": 0, 00:21:39.173 "r_mbytes_per_sec": 0, 00:21:39.173 "w_mbytes_per_sec": 0 00:21:39.173 }, 00:21:39.173 "claimed": true, 00:21:39.173 "claim_type": "exclusive_write", 00:21:39.173 "zoned": 
false, 00:21:39.173 "supported_io_types": { 00:21:39.173 "read": true, 00:21:39.173 "write": true, 00:21:39.173 "unmap": true, 00:21:39.173 "flush": true, 00:21:39.173 "reset": true, 00:21:39.173 "nvme_admin": false, 00:21:39.173 "nvme_io": false, 00:21:39.173 "nvme_io_md": false, 00:21:39.173 "write_zeroes": true, 00:21:39.173 "zcopy": true, 00:21:39.173 "get_zone_info": false, 00:21:39.173 "zone_management": false, 00:21:39.173 "zone_append": false, 00:21:39.173 "compare": false, 00:21:39.173 "compare_and_write": false, 00:21:39.173 "abort": true, 00:21:39.173 "seek_hole": false, 00:21:39.173 "seek_data": false, 00:21:39.173 "copy": true, 00:21:39.173 "nvme_iov_md": false 00:21:39.173 }, 00:21:39.173 "memory_domains": [ 00:21:39.173 { 00:21:39.173 "dma_device_id": "system", 00:21:39.173 "dma_device_type": 1 00:21:39.173 }, 00:21:39.173 { 00:21:39.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:39.173 "dma_device_type": 2 00:21:39.173 } 00:21:39.173 ], 00:21:39.173 "driver_specific": {} 00:21:39.173 }' 00:21:39.173 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.173 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.173 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:39.173 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:39.431 17:15:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:39.431 17:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:39.690 [2024-07-23 17:15:35.036977] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:39.690 [2024-07-23 17:15:35.037005] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:39.690 [2024-07-23 17:15:35.037059] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:39.690 [2024-07-23 17:15:35.037311] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:39.690 [2024-07-23 17:15:35.037323] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b4b60 name Existed_Raid, state offline 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4161422 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4161422 ']' 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4161422 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4161422 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4161422' 00:21:39.690 killing process with pid 4161422 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4161422 00:21:39.690 [2024-07-23 17:15:35.109127] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:39.690 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4161422 00:21:39.949 [2024-07-23 17:15:35.135500] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:39.949 17:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:39.949 00:21:39.949 real 0m31.086s 00:21:39.949 user 0m57.188s 00:21:39.949 sys 0m5.508s 00:21:39.949 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:39.949 17:15:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:39.949 ************************************ 00:21:39.949 END TEST raid_state_function_test_sb 00:21:39.949 ************************************ 00:21:40.208 17:15:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:40.208 17:15:35 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:21:40.208 17:15:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:40.208 17:15:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:40.208 17:15:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:40.208 ************************************ 00:21:40.208 START TEST raid_superblock_test 00:21:40.208 ************************************ 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4166437 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4166437 /var/tmp/spdk-raid.sock 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4166437 ']' 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:40.208 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:40.208 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.208 [2024-07-23 17:15:35.517759] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
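Throughout this log the test script pipes `bdev_get_bdevs` output through jq to pull out the configured base bdev names (the filter at bdev_raid.sh@201). A minimal Python sketch of that same selection, using a trimmed-down record shaped like the `raid_bdev_info` JSON dumped above — the helper name is ours, not SPDK's, and `pt3` is flipped to unconfigured here (unlike in the log) purely so the filter visibly drops something:

```python
import json

# Trimmed-down bdev_get_bdevs record, shaped like the raid_bdev_info
# JSON captured in the log above (most fields omitted).
raid_bdev_info = json.loads("""
{
  "name": "raid_bdev1",
  "driver_specific": {
    "raid": {
      "raid_level": "raid1",
      "num_base_bdevs": 3,
      "base_bdevs_list": [
        {"name": "pt1", "is_configured": true},
        {"name": "pt2", "is_configured": true},
        {"name": "pt3", "is_configured": false}
      ]
    }
  }
}
""")

def configured_base_bdev_names(info):
    """Python equivalent of the test's jq filter:
    .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name
    """
    return [b["name"]
            for b in info["driver_specific"]["raid"]["base_bdevs_list"]
            if b["is_configured"]]

print(configured_base_bdev_names(raid_bdev_info))  # ['pt1', 'pt2']
```

The shell script then word-splits the resulting names into `$base_bdev_names` and loops over them, which is why each name must be free of whitespace.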
00:21:40.208 [2024-07-23 17:15:35.517904] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4166437 ] 00:21:40.467 [2024-07-23 17:15:35.723675] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:40.467 [2024-07-23 17:15:35.782369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:40.467 [2024-07-23 17:15:35.845837] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:40.467 [2024-07-23 17:15:35.845868] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:40.726 17:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:40.984 malloc1 00:21:40.984 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:41.243 [2024-07-23 17:15:36.408501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:41.243 [2024-07-23 17:15:36.408548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.243 [2024-07-23 17:15:36.408571] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16ce070 00:21:41.243 [2024-07-23 17:15:36.408583] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.243 [2024-07-23 17:15:36.410232] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.243 [2024-07-23 17:15:36.410261] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:41.243 pt1 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:41.243 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:41.243 17:15:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:41.243 malloc2 00:21:41.502 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:41.502 [2024-07-23 17:15:36.906650] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:41.502 [2024-07-23 17:15:36.906697] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:41.502 [2024-07-23 17:15:36.906715] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15b4920 00:21:41.502 [2024-07-23 17:15:36.906727] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:41.502 [2024-07-23 17:15:36.908484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:41.502 [2024-07-23 17:15:36.908513] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:41.502 pt2 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:41.761 17:15:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:41.761 17:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:41.761 malloc3 00:21:41.761 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:42.019 [2024-07-23 17:15:37.408967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:42.019 [2024-07-23 17:15:37.409011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:42.019 [2024-07-23 17:15:37.409028] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c63e0 00:21:42.019 [2024-07-23 17:15:37.409040] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:42.019 [2024-07-23 17:15:37.410413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:42.019 [2024-07-23 17:15:37.410439] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:42.019 pt3 00:21:42.019 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:42.019 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:42.019 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:21:42.278 [2024-07-23 17:15:37.665828] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:42.278 [2024-07-23 17:15:37.667178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
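For each base bdev the script fetches, it then asserts that `block_size` is 512 and that `md_size`, `md_interleave`, and `dif_type` are all null (the `[[ ... == ... ]]` checks at bdev_raid.sh@205-208). A hedged Python rendering of those four checks; the field names come from the bdev JSON records in this log, while the function itself is illustrative rather than part of SPDK:

```python
def verify_base_bdev_md(bdev_info, expected_block_size=512):
    """Mirror the shell checks at bdev_raid.sh@205-208: block_size must
    match, and the metadata fields must be absent (jq prints them as null).
    """
    assert bdev_info["block_size"] == expected_block_size, "block_size mismatch"
    # jq emits `null` for keys missing from the JSON; dict.get() gives None.
    for key in ("md_size", "md_interleave", "dif_type"):
        assert bdev_info.get(key) is None, f"{key} unexpectedly set"
    return True

# Values copied from the NewBaseBdev record dumped earlier in this log.
base_bdev_info = {"name": "NewBaseBdev", "block_size": 512, "num_blocks": 65536}
print(verify_base_bdev_md(base_bdev_info))  # True
```

Running the same jq expression twice per field in the log (once traced, once captured) is just xtrace noise from command substitution, not a duplicated check.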
00:21:42.278 [2024-07-23 17:15:37.667233] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:42.278 [2024-07-23 17:15:37.667385] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c8870 00:21:42.278 [2024-07-23 17:15:37.667396] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:42.278 [2024-07-23 17:15:37.667592] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15335e0 00:21:42.278 [2024-07-23 17:15:37.667748] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c8870 00:21:42.278 [2024-07-23 17:15:37.667758] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c8870 00:21:42.278 [2024-07-23 17:15:37.667860] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.278 17:15:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.278 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.537 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.537 "name": "raid_bdev1", 00:21:42.537 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:42.537 "strip_size_kb": 0, 00:21:42.537 "state": "online", 00:21:42.537 "raid_level": "raid1", 00:21:42.537 "superblock": true, 00:21:42.537 "num_base_bdevs": 3, 00:21:42.537 "num_base_bdevs_discovered": 3, 00:21:42.537 "num_base_bdevs_operational": 3, 00:21:42.537 "base_bdevs_list": [ 00:21:42.537 { 00:21:42.537 "name": "pt1", 00:21:42.537 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:42.537 "is_configured": true, 00:21:42.537 "data_offset": 2048, 00:21:42.537 "data_size": 63488 00:21:42.537 }, 00:21:42.537 { 00:21:42.537 "name": "pt2", 00:21:42.537 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:42.537 "is_configured": true, 00:21:42.537 "data_offset": 2048, 00:21:42.537 "data_size": 63488 00:21:42.537 }, 00:21:42.537 { 00:21:42.537 "name": "pt3", 00:21:42.537 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:42.537 "is_configured": true, 00:21:42.537 "data_offset": 2048, 00:21:42.537 "data_size": 63488 00:21:42.537 } 00:21:42.537 ] 00:21:42.537 }' 00:21:42.537 17:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.537 17:15:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:43.105 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:43.363 [2024-07-23 17:15:38.736885] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:43.363 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:43.363 "name": "raid_bdev1", 00:21:43.363 "aliases": [ 00:21:43.363 "5a4a5eec-1f4b-4288-9fc8-12db73524c36" 00:21:43.363 ], 00:21:43.363 "product_name": "Raid Volume", 00:21:43.363 "block_size": 512, 00:21:43.363 "num_blocks": 63488, 00:21:43.363 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:43.363 "assigned_rate_limits": { 00:21:43.363 "rw_ios_per_sec": 0, 00:21:43.363 "rw_mbytes_per_sec": 0, 00:21:43.363 "r_mbytes_per_sec": 0, 00:21:43.363 "w_mbytes_per_sec": 0 00:21:43.363 }, 00:21:43.363 "claimed": false, 00:21:43.363 "zoned": false, 00:21:43.363 "supported_io_types": { 00:21:43.363 "read": true, 00:21:43.363 "write": true, 00:21:43.363 "unmap": false, 00:21:43.363 "flush": false, 00:21:43.363 "reset": true, 00:21:43.363 "nvme_admin": false, 00:21:43.363 "nvme_io": false, 00:21:43.363 "nvme_io_md": false, 00:21:43.363 "write_zeroes": true, 00:21:43.363 "zcopy": false, 00:21:43.363 "get_zone_info": false, 00:21:43.363 "zone_management": false, 00:21:43.363 "zone_append": false, 00:21:43.363 "compare": false, 00:21:43.363 "compare_and_write": false, 00:21:43.363 "abort": false, 00:21:43.363 "seek_hole": false, 
00:21:43.363 "seek_data": false, 00:21:43.363 "copy": false, 00:21:43.363 "nvme_iov_md": false 00:21:43.363 }, 00:21:43.363 "memory_domains": [ 00:21:43.363 { 00:21:43.363 "dma_device_id": "system", 00:21:43.363 "dma_device_type": 1 00:21:43.363 }, 00:21:43.363 { 00:21:43.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.363 "dma_device_type": 2 00:21:43.363 }, 00:21:43.363 { 00:21:43.363 "dma_device_id": "system", 00:21:43.363 "dma_device_type": 1 00:21:43.363 }, 00:21:43.363 { 00:21:43.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.363 "dma_device_type": 2 00:21:43.363 }, 00:21:43.363 { 00:21:43.363 "dma_device_id": "system", 00:21:43.363 "dma_device_type": 1 00:21:43.363 }, 00:21:43.364 { 00:21:43.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.364 "dma_device_type": 2 00:21:43.364 } 00:21:43.364 ], 00:21:43.364 "driver_specific": { 00:21:43.364 "raid": { 00:21:43.364 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:43.364 "strip_size_kb": 0, 00:21:43.364 "state": "online", 00:21:43.364 "raid_level": "raid1", 00:21:43.364 "superblock": true, 00:21:43.364 "num_base_bdevs": 3, 00:21:43.364 "num_base_bdevs_discovered": 3, 00:21:43.364 "num_base_bdevs_operational": 3, 00:21:43.364 "base_bdevs_list": [ 00:21:43.364 { 00:21:43.364 "name": "pt1", 00:21:43.364 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:43.364 "is_configured": true, 00:21:43.364 "data_offset": 2048, 00:21:43.364 "data_size": 63488 00:21:43.364 }, 00:21:43.364 { 00:21:43.364 "name": "pt2", 00:21:43.364 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:43.364 "is_configured": true, 00:21:43.364 "data_offset": 2048, 00:21:43.364 "data_size": 63488 00:21:43.364 }, 00:21:43.364 { 00:21:43.364 "name": "pt3", 00:21:43.364 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:43.364 "is_configured": true, 00:21:43.364 "data_offset": 2048, 00:21:43.364 "data_size": 63488 00:21:43.364 } 00:21:43.364 ] 00:21:43.364 } 00:21:43.364 } 00:21:43.364 }' 00:21:43.364 17:15:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:43.622 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:43.622 pt2 00:21:43.622 pt3' 00:21:43.622 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:43.622 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:43.622 17:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:43.881 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:43.881 "name": "pt1", 00:21:43.881 "aliases": [ 00:21:43.881 "00000000-0000-0000-0000-000000000001" 00:21:43.881 ], 00:21:43.881 "product_name": "passthru", 00:21:43.881 "block_size": 512, 00:21:43.881 "num_blocks": 65536, 00:21:43.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:43.881 "assigned_rate_limits": { 00:21:43.881 "rw_ios_per_sec": 0, 00:21:43.881 "rw_mbytes_per_sec": 0, 00:21:43.881 "r_mbytes_per_sec": 0, 00:21:43.881 "w_mbytes_per_sec": 0 00:21:43.881 }, 00:21:43.881 "claimed": true, 00:21:43.881 "claim_type": "exclusive_write", 00:21:43.881 "zoned": false, 00:21:43.881 "supported_io_types": { 00:21:43.881 "read": true, 00:21:43.881 "write": true, 00:21:43.881 "unmap": true, 00:21:43.881 "flush": true, 00:21:43.881 "reset": true, 00:21:43.881 "nvme_admin": false, 00:21:43.881 "nvme_io": false, 00:21:43.881 "nvme_io_md": false, 00:21:43.881 "write_zeroes": true, 00:21:43.882 "zcopy": true, 00:21:43.882 "get_zone_info": false, 00:21:43.882 "zone_management": false, 00:21:43.882 "zone_append": false, 00:21:43.882 "compare": false, 00:21:43.882 "compare_and_write": false, 00:21:43.882 "abort": true, 00:21:43.882 "seek_hole": false, 00:21:43.882 "seek_data": false, 
00:21:43.882 "copy": true, 00:21:43.882 "nvme_iov_md": false 00:21:43.882 }, 00:21:43.882 "memory_domains": [ 00:21:43.882 { 00:21:43.882 "dma_device_id": "system", 00:21:43.882 "dma_device_type": 1 00:21:43.882 }, 00:21:43.882 { 00:21:43.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.882 "dma_device_type": 2 00:21:43.882 } 00:21:43.882 ], 00:21:43.882 "driver_specific": { 00:21:43.882 "passthru": { 00:21:43.882 "name": "pt1", 00:21:43.882 "base_bdev_name": "malloc1" 00:21:43.882 } 00:21:43.882 } 00:21:43.882 }' 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:43.882 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:21:44.140 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.399 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.399 "name": "pt2", 00:21:44.399 "aliases": [ 00:21:44.399 "00000000-0000-0000-0000-000000000002" 00:21:44.399 ], 00:21:44.399 "product_name": "passthru", 00:21:44.399 "block_size": 512, 00:21:44.399 "num_blocks": 65536, 00:21:44.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:44.399 "assigned_rate_limits": { 00:21:44.399 "rw_ios_per_sec": 0, 00:21:44.399 "rw_mbytes_per_sec": 0, 00:21:44.399 "r_mbytes_per_sec": 0, 00:21:44.399 "w_mbytes_per_sec": 0 00:21:44.399 }, 00:21:44.399 "claimed": true, 00:21:44.399 "claim_type": "exclusive_write", 00:21:44.399 "zoned": false, 00:21:44.399 "supported_io_types": { 00:21:44.399 "read": true, 00:21:44.399 "write": true, 00:21:44.399 "unmap": true, 00:21:44.399 "flush": true, 00:21:44.399 "reset": true, 00:21:44.399 "nvme_admin": false, 00:21:44.399 "nvme_io": false, 00:21:44.399 "nvme_io_md": false, 00:21:44.399 "write_zeroes": true, 00:21:44.399 "zcopy": true, 00:21:44.399 "get_zone_info": false, 00:21:44.399 "zone_management": false, 00:21:44.399 "zone_append": false, 00:21:44.399 "compare": false, 00:21:44.399 "compare_and_write": false, 00:21:44.399 "abort": true, 00:21:44.399 "seek_hole": false, 00:21:44.399 "seek_data": false, 00:21:44.399 "copy": true, 00:21:44.399 "nvme_iov_md": false 00:21:44.399 }, 00:21:44.399 "memory_domains": [ 00:21:44.399 { 00:21:44.399 "dma_device_id": "system", 00:21:44.399 "dma_device_type": 1 00:21:44.399 }, 00:21:44.399 { 00:21:44.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.399 "dma_device_type": 2 00:21:44.399 } 00:21:44.399 ], 00:21:44.399 "driver_specific": { 00:21:44.399 "passthru": { 00:21:44.399 "name": "pt2", 00:21:44.399 "base_bdev_name": "malloc2" 00:21:44.399 } 00:21:44.399 } 00:21:44.399 }' 00:21:44.399 17:15:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.399 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.400 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:44.400 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.400 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.659 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:44.659 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.659 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.659 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:44.659 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.659 17:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.659 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:44.659 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.659 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:44.659 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.918 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.918 "name": "pt3", 00:21:44.918 "aliases": [ 00:21:44.918 "00000000-0000-0000-0000-000000000003" 00:21:44.918 ], 00:21:44.918 "product_name": "passthru", 00:21:44.918 "block_size": 512, 00:21:44.918 "num_blocks": 65536, 00:21:44.918 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:44.918 "assigned_rate_limits": { 
00:21:44.918 "rw_ios_per_sec": 0, 00:21:44.918 "rw_mbytes_per_sec": 0, 00:21:44.918 "r_mbytes_per_sec": 0, 00:21:44.918 "w_mbytes_per_sec": 0 00:21:44.918 }, 00:21:44.918 "claimed": true, 00:21:44.918 "claim_type": "exclusive_write", 00:21:44.918 "zoned": false, 00:21:44.918 "supported_io_types": { 00:21:44.918 "read": true, 00:21:44.918 "write": true, 00:21:44.918 "unmap": true, 00:21:44.918 "flush": true, 00:21:44.918 "reset": true, 00:21:44.918 "nvme_admin": false, 00:21:44.918 "nvme_io": false, 00:21:44.918 "nvme_io_md": false, 00:21:44.918 "write_zeroes": true, 00:21:44.918 "zcopy": true, 00:21:44.918 "get_zone_info": false, 00:21:44.918 "zone_management": false, 00:21:44.918 "zone_append": false, 00:21:44.918 "compare": false, 00:21:44.918 "compare_and_write": false, 00:21:44.918 "abort": true, 00:21:44.918 "seek_hole": false, 00:21:44.918 "seek_data": false, 00:21:44.918 "copy": true, 00:21:44.918 "nvme_iov_md": false 00:21:44.918 }, 00:21:44.918 "memory_domains": [ 00:21:44.918 { 00:21:44.918 "dma_device_id": "system", 00:21:44.918 "dma_device_type": 1 00:21:44.918 }, 00:21:44.918 { 00:21:44.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.918 "dma_device_type": 2 00:21:44.918 } 00:21:44.918 ], 00:21:44.918 "driver_specific": { 00:21:44.918 "passthru": { 00:21:44.918 "name": "pt3", 00:21:44.918 "base_bdev_name": "malloc3" 00:21:44.918 } 00:21:44.918 } 00:21:44.918 }' 00:21:44.918 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.918 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.177 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.436 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.436 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:45.436 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:45.695 [2024-07-23 17:15:40.862551] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:45.695 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5a4a5eec-1f4b-4288-9fc8-12db73524c36 00:21:45.695 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 5a4a5eec-1f4b-4288-9fc8-12db73524c36 ']' 00:21:45.695 17:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:45.695 [2024-07-23 17:15:41.106929] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:45.695 [2024-07-23 17:15:41.106949] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:45.695 [2024-07-23 17:15:41.106997] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:45.695 [2024-07-23 17:15:41.107064] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:21:45.695 [2024-07-23 17:15:41.107075] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c8870 name raid_bdev1, state offline 00:21:45.954 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.954 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:45.954 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:45.954 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:45.954 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:45.954 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:46.213 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:46.213 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:46.510 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:46.510 17:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:46.769 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:46.769 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:47.027 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:47.028 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:21:47.286 [2024-07-23 17:15:42.598807] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:47.286 [2024-07-23 17:15:42.600178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:47.286 [2024-07-23 17:15:42.600222] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:47.286 [2024-07-23 17:15:42.600267] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:47.286 [2024-07-23 17:15:42.600305] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:47.286 [2024-07-23 17:15:42.600327] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:47.286 [2024-07-23 17:15:42.600346] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:47.286 [2024-07-23 17:15:42.600357] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b4090 name raid_bdev1, state configuring 00:21:47.286 request: 00:21:47.286 { 00:21:47.286 "name": "raid_bdev1", 00:21:47.286 "raid_level": "raid1", 00:21:47.286 "base_bdevs": [ 00:21:47.286 "malloc1", 00:21:47.286 "malloc2", 00:21:47.286 "malloc3" 00:21:47.286 ], 00:21:47.286 "superblock": false, 00:21:47.286 "method": "bdev_raid_create", 00:21:47.286 "req_id": 1 00:21:47.286 } 00:21:47.286 Got JSON-RPC error response 00:21:47.286 response: 00:21:47.286 { 00:21:47.286 "code": -17, 00:21:47.286 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:47.286 } 00:21:47.286 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:47.286 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:47.286 17:15:42 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:47.286 17:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:47.286 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.286 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:47.545 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:47.545 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:47.545 17:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:47.804 [2024-07-23 17:15:43.088044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:47.804 [2024-07-23 17:15:43.088083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.804 [2024-07-23 17:15:43.088100] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16ce3b0 00:21:47.804 [2024-07-23 17:15:43.088112] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.804 [2024-07-23 17:15:43.089714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:47.804 [2024-07-23 17:15:43.089743] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:47.804 [2024-07-23 17:15:43.089806] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:47.804 [2024-07-23 17:15:43.089829] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:47.804 pt1 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.804 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:48.064 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.064 "name": "raid_bdev1", 00:21:48.064 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:48.064 "strip_size_kb": 0, 00:21:48.064 "state": "configuring", 00:21:48.064 "raid_level": "raid1", 00:21:48.064 "superblock": true, 00:21:48.064 "num_base_bdevs": 3, 00:21:48.064 "num_base_bdevs_discovered": 1, 00:21:48.064 "num_base_bdevs_operational": 3, 00:21:48.064 "base_bdevs_list": [ 00:21:48.064 { 00:21:48.064 "name": "pt1", 00:21:48.064 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:48.064 "is_configured": true, 00:21:48.064 "data_offset": 2048, 00:21:48.064 
"data_size": 63488 00:21:48.064 }, 00:21:48.064 { 00:21:48.064 "name": null, 00:21:48.064 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:48.064 "is_configured": false, 00:21:48.064 "data_offset": 2048, 00:21:48.064 "data_size": 63488 00:21:48.064 }, 00:21:48.064 { 00:21:48.064 "name": null, 00:21:48.064 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:48.064 "is_configured": false, 00:21:48.064 "data_offset": 2048, 00:21:48.064 "data_size": 63488 00:21:48.064 } 00:21:48.064 ] 00:21:48.064 }' 00:21:48.064 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.064 17:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.632 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:21:48.632 17:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:48.891 [2024-07-23 17:15:44.170936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:48.891 [2024-07-23 17:15:44.170983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:48.891 [2024-07-23 17:15:44.171000] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c7860 00:21:48.891 [2024-07-23 17:15:44.171012] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:48.891 [2024-07-23 17:15:44.171344] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:48.891 [2024-07-23 17:15:44.171361] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:48.891 [2024-07-23 17:15:44.171421] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:48.891 [2024-07-23 17:15:44.171439] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:21:48.891 pt2 00:21:48.891 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:49.149 [2024-07-23 17:15:44.415588] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.150 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.409 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.409 "name": "raid_bdev1", 00:21:49.409 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:49.409 "strip_size_kb": 
0, 00:21:49.409 "state": "configuring", 00:21:49.409 "raid_level": "raid1", 00:21:49.409 "superblock": true, 00:21:49.409 "num_base_bdevs": 3, 00:21:49.409 "num_base_bdevs_discovered": 1, 00:21:49.409 "num_base_bdevs_operational": 3, 00:21:49.409 "base_bdevs_list": [ 00:21:49.409 { 00:21:49.409 "name": "pt1", 00:21:49.409 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:49.409 "is_configured": true, 00:21:49.409 "data_offset": 2048, 00:21:49.409 "data_size": 63488 00:21:49.409 }, 00:21:49.409 { 00:21:49.409 "name": null, 00:21:49.409 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:49.409 "is_configured": false, 00:21:49.409 "data_offset": 2048, 00:21:49.409 "data_size": 63488 00:21:49.409 }, 00:21:49.409 { 00:21:49.409 "name": null, 00:21:49.409 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:49.409 "is_configured": false, 00:21:49.409 "data_offset": 2048, 00:21:49.409 "data_size": 63488 00:21:49.409 } 00:21:49.409 ] 00:21:49.409 }' 00:21:49.409 17:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.409 17:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.976 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:49.976 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:49.976 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:50.235 [2024-07-23 17:15:45.502476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:50.235 [2024-07-23 17:15:45.502523] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.235 [2024-07-23 17:15:45.502541] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c5670 00:21:50.235 
[2024-07-23 17:15:45.502553] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.235 [2024-07-23 17:15:45.502891] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.235 [2024-07-23 17:15:45.502917] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:50.235 [2024-07-23 17:15:45.502978] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:50.235 [2024-07-23 17:15:45.502997] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:50.235 pt2 00:21:50.235 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:50.235 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:50.235 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:50.494 [2024-07-23 17:15:45.755142] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:50.495 [2024-07-23 17:15:45.755171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.495 [2024-07-23 17:15:45.755187] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c9330 00:21:50.495 [2024-07-23 17:15:45.755198] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.495 [2024-07-23 17:15:45.755475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.495 [2024-07-23 17:15:45.755492] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:50.495 [2024-07-23 17:15:45.755539] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:50.495 [2024-07-23 17:15:45.755555] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:21:50.495 [2024-07-23 17:15:45.755654] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c7b80 00:21:50.495 [2024-07-23 17:15:45.755664] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:50.495 [2024-07-23 17:15:45.755825] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15335e0 00:21:50.495 [2024-07-23 17:15:45.755963] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c7b80 00:21:50.495 [2024-07-23 17:15:45.755974] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c7b80 00:21:50.495 [2024-07-23 17:15:45.756067] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:50.495 pt3 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.495 17:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.754 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.754 "name": "raid_bdev1", 00:21:50.754 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:50.754 "strip_size_kb": 0, 00:21:50.754 "state": "online", 00:21:50.754 "raid_level": "raid1", 00:21:50.754 "superblock": true, 00:21:50.754 "num_base_bdevs": 3, 00:21:50.754 "num_base_bdevs_discovered": 3, 00:21:50.754 "num_base_bdevs_operational": 3, 00:21:50.754 "base_bdevs_list": [ 00:21:50.754 { 00:21:50.754 "name": "pt1", 00:21:50.754 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:50.754 "is_configured": true, 00:21:50.754 "data_offset": 2048, 00:21:50.754 "data_size": 63488 00:21:50.754 }, 00:21:50.754 { 00:21:50.754 "name": "pt2", 00:21:50.754 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:50.754 "is_configured": true, 00:21:50.754 "data_offset": 2048, 00:21:50.754 "data_size": 63488 00:21:50.754 }, 00:21:50.754 { 00:21:50.754 "name": "pt3", 00:21:50.754 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:50.754 "is_configured": true, 00:21:50.754 "data_offset": 2048, 00:21:50.754 "data_size": 63488 00:21:50.754 } 00:21:50.754 ] 00:21:50.754 }' 00:21:50.754 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.754 17:15:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.321 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:51.321 17:15:46 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:51.322 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:51.322 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:51.322 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:51.322 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:51.322 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:51.322 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:51.581 [2024-07-23 17:15:46.782150] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:51.581 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:51.581 "name": "raid_bdev1", 00:21:51.581 "aliases": [ 00:21:51.581 "5a4a5eec-1f4b-4288-9fc8-12db73524c36" 00:21:51.581 ], 00:21:51.581 "product_name": "Raid Volume", 00:21:51.581 "block_size": 512, 00:21:51.581 "num_blocks": 63488, 00:21:51.581 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:51.581 "assigned_rate_limits": { 00:21:51.581 "rw_ios_per_sec": 0, 00:21:51.581 "rw_mbytes_per_sec": 0, 00:21:51.581 "r_mbytes_per_sec": 0, 00:21:51.581 "w_mbytes_per_sec": 0 00:21:51.581 }, 00:21:51.581 "claimed": false, 00:21:51.581 "zoned": false, 00:21:51.581 "supported_io_types": { 00:21:51.581 "read": true, 00:21:51.581 "write": true, 00:21:51.581 "unmap": false, 00:21:51.581 "flush": false, 00:21:51.581 "reset": true, 00:21:51.581 "nvme_admin": false, 00:21:51.581 "nvme_io": false, 00:21:51.581 "nvme_io_md": false, 00:21:51.581 "write_zeroes": true, 00:21:51.581 "zcopy": false, 00:21:51.581 "get_zone_info": false, 00:21:51.581 "zone_management": false, 00:21:51.581 "zone_append": false, 
00:21:51.581 "compare": false, 00:21:51.581 "compare_and_write": false, 00:21:51.581 "abort": false, 00:21:51.581 "seek_hole": false, 00:21:51.581 "seek_data": false, 00:21:51.581 "copy": false, 00:21:51.581 "nvme_iov_md": false 00:21:51.581 }, 00:21:51.581 "memory_domains": [ 00:21:51.581 { 00:21:51.581 "dma_device_id": "system", 00:21:51.581 "dma_device_type": 1 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.581 "dma_device_type": 2 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "dma_device_id": "system", 00:21:51.581 "dma_device_type": 1 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.581 "dma_device_type": 2 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "dma_device_id": "system", 00:21:51.581 "dma_device_type": 1 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.581 "dma_device_type": 2 00:21:51.581 } 00:21:51.581 ], 00:21:51.581 "driver_specific": { 00:21:51.581 "raid": { 00:21:51.581 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:51.581 "strip_size_kb": 0, 00:21:51.581 "state": "online", 00:21:51.581 "raid_level": "raid1", 00:21:51.581 "superblock": true, 00:21:51.581 "num_base_bdevs": 3, 00:21:51.581 "num_base_bdevs_discovered": 3, 00:21:51.581 "num_base_bdevs_operational": 3, 00:21:51.581 "base_bdevs_list": [ 00:21:51.581 { 00:21:51.581 "name": "pt1", 00:21:51.581 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:51.581 "is_configured": true, 00:21:51.581 "data_offset": 2048, 00:21:51.581 "data_size": 63488 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "name": "pt2", 00:21:51.581 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:51.581 "is_configured": true, 00:21:51.581 "data_offset": 2048, 00:21:51.581 "data_size": 63488 00:21:51.581 }, 00:21:51.581 { 00:21:51.581 "name": "pt3", 00:21:51.581 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:51.581 "is_configured": true, 00:21:51.581 "data_offset": 2048, 
00:21:51.581 "data_size": 63488 00:21:51.581 } 00:21:51.581 ] 00:21:51.581 } 00:21:51.581 } 00:21:51.581 }' 00:21:51.581 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:51.581 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:51.581 pt2 00:21:51.581 pt3' 00:21:51.581 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:51.581 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:51.581 17:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:51.840 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:51.840 "name": "pt1", 00:21:51.840 "aliases": [ 00:21:51.840 "00000000-0000-0000-0000-000000000001" 00:21:51.840 ], 00:21:51.840 "product_name": "passthru", 00:21:51.840 "block_size": 512, 00:21:51.840 "num_blocks": 65536, 00:21:51.840 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:51.840 "assigned_rate_limits": { 00:21:51.840 "rw_ios_per_sec": 0, 00:21:51.840 "rw_mbytes_per_sec": 0, 00:21:51.840 "r_mbytes_per_sec": 0, 00:21:51.840 "w_mbytes_per_sec": 0 00:21:51.840 }, 00:21:51.840 "claimed": true, 00:21:51.840 "claim_type": "exclusive_write", 00:21:51.840 "zoned": false, 00:21:51.840 "supported_io_types": { 00:21:51.840 "read": true, 00:21:51.840 "write": true, 00:21:51.840 "unmap": true, 00:21:51.840 "flush": true, 00:21:51.840 "reset": true, 00:21:51.840 "nvme_admin": false, 00:21:51.840 "nvme_io": false, 00:21:51.840 "nvme_io_md": false, 00:21:51.840 "write_zeroes": true, 00:21:51.840 "zcopy": true, 00:21:51.840 "get_zone_info": false, 00:21:51.840 "zone_management": false, 00:21:51.840 "zone_append": false, 00:21:51.840 "compare": false, 
00:21:51.840 "compare_and_write": false, 00:21:51.840 "abort": true, 00:21:51.840 "seek_hole": false, 00:21:51.840 "seek_data": false, 00:21:51.840 "copy": true, 00:21:51.840 "nvme_iov_md": false 00:21:51.840 }, 00:21:51.840 "memory_domains": [ 00:21:51.840 { 00:21:51.840 "dma_device_id": "system", 00:21:51.840 "dma_device_type": 1 00:21:51.840 }, 00:21:51.840 { 00:21:51.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.840 "dma_device_type": 2 00:21:51.840 } 00:21:51.840 ], 00:21:51.840 "driver_specific": { 00:21:51.840 "passthru": { 00:21:51.840 "name": "pt1", 00:21:51.840 "base_bdev_name": "malloc1" 00:21:51.840 } 00:21:51.840 } 00:21:51.840 }' 00:21:51.840 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.840 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.840 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:51.840 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.840 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:52.099 17:15:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:52.099 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:52.358 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:52.358 "name": "pt2", 00:21:52.358 "aliases": [ 00:21:52.358 "00000000-0000-0000-0000-000000000002" 00:21:52.358 ], 00:21:52.358 "product_name": "passthru", 00:21:52.358 "block_size": 512, 00:21:52.358 "num_blocks": 65536, 00:21:52.358 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:52.358 "assigned_rate_limits": { 00:21:52.358 "rw_ios_per_sec": 0, 00:21:52.358 "rw_mbytes_per_sec": 0, 00:21:52.358 "r_mbytes_per_sec": 0, 00:21:52.358 "w_mbytes_per_sec": 0 00:21:52.358 }, 00:21:52.358 "claimed": true, 00:21:52.358 "claim_type": "exclusive_write", 00:21:52.358 "zoned": false, 00:21:52.358 "supported_io_types": { 00:21:52.358 "read": true, 00:21:52.358 "write": true, 00:21:52.358 "unmap": true, 00:21:52.358 "flush": true, 00:21:52.358 "reset": true, 00:21:52.358 "nvme_admin": false, 00:21:52.358 "nvme_io": false, 00:21:52.358 "nvme_io_md": false, 00:21:52.358 "write_zeroes": true, 00:21:52.358 "zcopy": true, 00:21:52.358 "get_zone_info": false, 00:21:52.358 "zone_management": false, 00:21:52.358 "zone_append": false, 00:21:52.358 "compare": false, 00:21:52.358 "compare_and_write": false, 00:21:52.358 "abort": true, 00:21:52.358 "seek_hole": false, 00:21:52.358 "seek_data": false, 00:21:52.358 "copy": true, 00:21:52.358 "nvme_iov_md": false 00:21:52.358 }, 00:21:52.358 "memory_domains": [ 00:21:52.358 { 00:21:52.358 "dma_device_id": "system", 00:21:52.358 "dma_device_type": 1 00:21:52.358 }, 00:21:52.358 { 00:21:52.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.358 "dma_device_type": 2 00:21:52.358 } 00:21:52.358 ], 00:21:52.358 "driver_specific": { 00:21:52.358 "passthru": { 00:21:52.358 
"name": "pt2", 00:21:52.358 "base_bdev_name": "malloc2" 00:21:52.358 } 00:21:52.358 } 00:21:52.358 }' 00:21:52.358 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.358 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:52.620 17:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.620 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.879 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:52.879 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:52.879 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:52.879 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:52.879 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:52.879 "name": "pt3", 00:21:52.879 "aliases": [ 00:21:52.879 "00000000-0000-0000-0000-000000000003" 00:21:52.879 ], 00:21:52.879 "product_name": "passthru", 00:21:52.879 "block_size": 512, 00:21:52.879 
"num_blocks": 65536, 00:21:52.879 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:52.879 "assigned_rate_limits": { 00:21:52.879 "rw_ios_per_sec": 0, 00:21:52.879 "rw_mbytes_per_sec": 0, 00:21:52.879 "r_mbytes_per_sec": 0, 00:21:52.879 "w_mbytes_per_sec": 0 00:21:52.879 }, 00:21:52.879 "claimed": true, 00:21:52.879 "claim_type": "exclusive_write", 00:21:52.879 "zoned": false, 00:21:52.879 "supported_io_types": { 00:21:52.879 "read": true, 00:21:52.879 "write": true, 00:21:52.879 "unmap": true, 00:21:52.879 "flush": true, 00:21:52.879 "reset": true, 00:21:52.879 "nvme_admin": false, 00:21:52.879 "nvme_io": false, 00:21:52.879 "nvme_io_md": false, 00:21:52.879 "write_zeroes": true, 00:21:52.879 "zcopy": true, 00:21:52.879 "get_zone_info": false, 00:21:52.879 "zone_management": false, 00:21:52.879 "zone_append": false, 00:21:52.879 "compare": false, 00:21:52.879 "compare_and_write": false, 00:21:52.879 "abort": true, 00:21:52.879 "seek_hole": false, 00:21:52.879 "seek_data": false, 00:21:52.879 "copy": true, 00:21:52.879 "nvme_iov_md": false 00:21:52.879 }, 00:21:52.879 "memory_domains": [ 00:21:52.879 { 00:21:52.879 "dma_device_id": "system", 00:21:52.879 "dma_device_type": 1 00:21:52.879 }, 00:21:52.879 { 00:21:52.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.879 "dma_device_type": 2 00:21:52.879 } 00:21:52.879 ], 00:21:52.879 "driver_specific": { 00:21:52.879 "passthru": { 00:21:52.879 "name": "pt3", 00:21:52.879 "base_bdev_name": "malloc3" 00:21:52.879 } 00:21:52.879 } 00:21:52.879 }' 00:21:52.879 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.138 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.397 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.397 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.397 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.397 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.397 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:53.397 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:53.656 [2024-07-23 17:15:48.879710] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:53.656 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 5a4a5eec-1f4b-4288-9fc8-12db73524c36 '!=' 5a4a5eec-1f4b-4288-9fc8-12db73524c36 ']' 00:21:53.656 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:53.656 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:53.656 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:53.656 17:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:53.915 [2024-07-23 17:15:49.128121] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.915 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.174 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.174 "name": "raid_bdev1", 00:21:54.174 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:54.174 "strip_size_kb": 0, 00:21:54.174 "state": "online", 00:21:54.174 "raid_level": "raid1", 00:21:54.174 "superblock": true, 00:21:54.174 "num_base_bdevs": 3, 00:21:54.174 "num_base_bdevs_discovered": 2, 00:21:54.174 "num_base_bdevs_operational": 2, 00:21:54.174 "base_bdevs_list": [ 00:21:54.174 { 00:21:54.174 "name": null, 00:21:54.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.174 "is_configured": false, 00:21:54.174 
"data_offset": 2048, 00:21:54.174 "data_size": 63488 00:21:54.174 }, 00:21:54.174 { 00:21:54.174 "name": "pt2", 00:21:54.174 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:54.174 "is_configured": true, 00:21:54.174 "data_offset": 2048, 00:21:54.174 "data_size": 63488 00:21:54.174 }, 00:21:54.174 { 00:21:54.174 "name": "pt3", 00:21:54.174 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:54.174 "is_configured": true, 00:21:54.174 "data_offset": 2048, 00:21:54.174 "data_size": 63488 00:21:54.174 } 00:21:54.174 ] 00:21:54.174 }' 00:21:54.174 17:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.174 17:15:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.741 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:55.000 [2024-07-23 17:15:50.231037] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:55.000 [2024-07-23 17:15:50.231066] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:55.000 [2024-07-23 17:15:50.231115] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:55.000 [2024-07-23 17:15:50.231171] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:55.000 [2024-07-23 17:15:50.231183] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c7b80 name raid_bdev1, state offline 00:21:55.000 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.000 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:55.258 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 
00:21:55.258 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:55.258 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:55.258 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:55.258 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:55.517 17:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:55.776 [2024-07-23 17:15:51.025097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:55.776 [2024-07-23 17:15:51.025143] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.776 [2024-07-23 17:15:51.025160] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151ca80 00:21:55.776 [2024-07-23 17:15:51.025172] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.776 [2024-07-23 17:15:51.026752] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.776 [2024-07-23 17:15:51.026781] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:55.776 [2024-07-23 17:15:51.026843] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:55.776 [2024-07-23 17:15:51.026867] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:55.776 pt2 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.776 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:21:56.035 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.035 "name": "raid_bdev1", 00:21:56.035 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:56.035 "strip_size_kb": 0, 00:21:56.035 "state": "configuring", 00:21:56.035 "raid_level": "raid1", 00:21:56.035 "superblock": true, 00:21:56.035 "num_base_bdevs": 3, 00:21:56.035 "num_base_bdevs_discovered": 1, 00:21:56.035 "num_base_bdevs_operational": 2, 00:21:56.035 "base_bdevs_list": [ 00:21:56.035 { 00:21:56.035 "name": null, 00:21:56.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.035 "is_configured": false, 00:21:56.035 "data_offset": 2048, 00:21:56.035 "data_size": 63488 00:21:56.035 }, 00:21:56.035 { 00:21:56.035 "name": "pt2", 00:21:56.035 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:56.035 "is_configured": true, 00:21:56.035 "data_offset": 2048, 00:21:56.035 "data_size": 63488 00:21:56.035 }, 00:21:56.035 { 00:21:56.035 "name": null, 00:21:56.035 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:56.035 "is_configured": false, 00:21:56.035 "data_offset": 2048, 00:21:56.035 "data_size": 63488 00:21:56.035 } 00:21:56.035 ] 00:21:56.035 }' 00:21:56.035 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.035 17:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.602 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:21:56.602 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:56.602 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:21:56.602 17:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:56.602 [2024-07-23 17:15:52.007721] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:56.602 [2024-07-23 17:15:52.007765] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.602 [2024-07-23 17:15:52.007783] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16cd2e0 00:21:56.602 [2024-07-23 17:15:52.007795] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.602 [2024-07-23 17:15:52.008130] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.602 [2024-07-23 17:15:52.008148] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:56.602 [2024-07-23 17:15:52.008205] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:56.602 [2024-07-23 17:15:52.008223] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:56.602 [2024-07-23 17:15:52.008317] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16ccc80 00:21:56.602 [2024-07-23 17:15:52.008327] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:56.602 [2024-07-23 17:15:52.008487] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ce300 00:21:56.602 [2024-07-23 17:15:52.008618] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16ccc80 00:21:56.602 [2024-07-23 17:15:52.008628] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16ccc80 00:21:56.602 [2024-07-23 17:15:52.008723] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:56.602 pt3 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.861 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.120 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.120 "name": "raid_bdev1", 00:21:57.120 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:57.120 "strip_size_kb": 0, 00:21:57.120 "state": "online", 00:21:57.120 "raid_level": "raid1", 00:21:57.120 "superblock": true, 00:21:57.120 "num_base_bdevs": 3, 00:21:57.120 "num_base_bdevs_discovered": 2, 00:21:57.120 "num_base_bdevs_operational": 2, 00:21:57.120 "base_bdevs_list": [ 00:21:57.120 { 00:21:57.120 "name": null, 00:21:57.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.120 "is_configured": false, 00:21:57.120 "data_offset": 2048, 00:21:57.120 "data_size": 63488 00:21:57.120 }, 00:21:57.120 { 00:21:57.120 "name": "pt2", 00:21:57.120 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:57.120 "is_configured": true, 00:21:57.120 
"data_offset": 2048, 00:21:57.120 "data_size": 63488 00:21:57.120 }, 00:21:57.120 { 00:21:57.120 "name": "pt3", 00:21:57.120 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:57.120 "is_configured": true, 00:21:57.120 "data_offset": 2048, 00:21:57.120 "data_size": 63488 00:21:57.120 } 00:21:57.120 ] 00:21:57.120 }' 00:21:57.120 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.120 17:15:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:57.687 17:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:57.687 [2024-07-23 17:15:52.978277] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:57.687 [2024-07-23 17:15:52.978303] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:57.687 [2024-07-23 17:15:52.978350] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:57.687 [2024-07-23 17:15:52.978400] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:57.687 [2024-07-23 17:15:52.978411] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ccc80 name raid_bdev1, state offline 00:21:57.687 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.687 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:57.946 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:57.946 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:57.946 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:21:57.946 17:15:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:21:57.946 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:58.204 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:58.462 [2024-07-23 17:15:53.676087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:58.462 [2024-07-23 17:15:53.676130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.462 [2024-07-23 17:15:53.676147] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151b690 00:21:58.462 [2024-07-23 17:15:53.676159] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.462 [2024-07-23 17:15:53.677714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.462 [2024-07-23 17:15:53.677740] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:58.462 [2024-07-23 17:15:53.677798] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:58.462 [2024-07-23 17:15:53.677821] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:58.462 [2024-07-23 17:15:53.677921] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:58.462 [2024-07-23 17:15:53.677934] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:58.462 [2024-07-23 17:15:53.677947] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151abe0 name raid_bdev1, state configuring 00:21:58.462 [2024-07-23 17:15:53.677969] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:58.462 pt1 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.462 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.721 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.721 "name": "raid_bdev1", 00:21:58.721 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:21:58.721 "strip_size_kb": 0, 00:21:58.721 "state": "configuring", 00:21:58.721 "raid_level": "raid1", 00:21:58.721 "superblock": true, 00:21:58.721 "num_base_bdevs": 3, 
00:21:58.721 "num_base_bdevs_discovered": 1, 00:21:58.721 "num_base_bdevs_operational": 2, 00:21:58.721 "base_bdevs_list": [ 00:21:58.721 { 00:21:58.721 "name": null, 00:21:58.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.721 "is_configured": false, 00:21:58.721 "data_offset": 2048, 00:21:58.721 "data_size": 63488 00:21:58.721 }, 00:21:58.721 { 00:21:58.721 "name": "pt2", 00:21:58.721 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:58.721 "is_configured": true, 00:21:58.721 "data_offset": 2048, 00:21:58.721 "data_size": 63488 00:21:58.721 }, 00:21:58.721 { 00:21:58.721 "name": null, 00:21:58.721 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:58.721 "is_configured": false, 00:21:58.721 "data_offset": 2048, 00:21:58.721 "data_size": 63488 00:21:58.721 } 00:21:58.721 ] 00:21:58.721 }' 00:21:58.721 17:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.721 17:15:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.289 17:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:21:59.289 17:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:59.547 17:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:21:59.547 17:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:59.806 [2024-07-23 17:15:55.043719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:59.806 [2024-07-23 17:15:55.043767] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:59.806 [2024-07-23 17:15:55.043785] 
vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16cf850 00:21:59.806 [2024-07-23 17:15:55.043797] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:59.806 [2024-07-23 17:15:55.044135] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:59.806 [2024-07-23 17:15:55.044154] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:59.806 [2024-07-23 17:15:55.044211] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:59.806 [2024-07-23 17:15:55.044230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:59.806 [2024-07-23 17:15:55.044322] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16cd8e0 00:21:59.806 [2024-07-23 17:15:55.044332] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:59.806 [2024-07-23 17:15:55.044503] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16cfae0 00:21:59.806 [2024-07-23 17:15:55.044623] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16cd8e0 00:21:59.806 [2024-07-23 17:15:55.044633] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16cd8e0 00:21:59.806 [2024-07-23 17:15:55.044727] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:59.806 pt3 00:21:59.806 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:59.806 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.806 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.807 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.066 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.066 "name": "raid_bdev1", 00:22:00.066 "uuid": "5a4a5eec-1f4b-4288-9fc8-12db73524c36", 00:22:00.066 "strip_size_kb": 0, 00:22:00.066 "state": "online", 00:22:00.066 "raid_level": "raid1", 00:22:00.066 "superblock": true, 00:22:00.066 "num_base_bdevs": 3, 00:22:00.066 "num_base_bdevs_discovered": 2, 00:22:00.066 "num_base_bdevs_operational": 2, 00:22:00.066 "base_bdevs_list": [ 00:22:00.066 { 00:22:00.066 "name": null, 00:22:00.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.066 "is_configured": false, 00:22:00.066 "data_offset": 2048, 00:22:00.066 "data_size": 63488 00:22:00.066 }, 00:22:00.066 { 00:22:00.066 "name": "pt2", 00:22:00.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:00.066 "is_configured": true, 00:22:00.066 "data_offset": 2048, 00:22:00.066 "data_size": 63488 00:22:00.066 }, 00:22:00.066 { 00:22:00.066 "name": "pt3", 00:22:00.066 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:00.066 "is_configured": true, 00:22:00.066 
"data_offset": 2048, 00:22:00.066 "data_size": 63488 00:22:00.066 } 00:22:00.066 ] 00:22:00.066 }' 00:22:00.066 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.066 17:15:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.633 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:00.633 17:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:00.892 17:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:00.892 17:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:00.892 17:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:01.152 [2024-07-23 17:15:56.351535] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:01.152 17:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 5a4a5eec-1f4b-4288-9fc8-12db73524c36 '!=' 5a4a5eec-1f4b-4288-9fc8-12db73524c36 ']' 00:22:01.152 17:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4166437 00:22:01.152 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4166437 ']' 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4166437 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4166437 00:22:01.153 
17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4166437' 00:22:01.153 killing process with pid 4166437 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4166437 00:22:01.153 [2024-07-23 17:15:56.419987] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:01.153 [2024-07-23 17:15:56.420038] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:01.153 [2024-07-23 17:15:56.420091] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:01.153 [2024-07-23 17:15:56.420103] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16cd8e0 name raid_bdev1, state offline 00:22:01.153 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4166437 00:22:01.153 [2024-07-23 17:15:56.450265] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:01.411 17:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:01.411 00:22:01.411 real 0m21.248s 00:22:01.411 user 0m39.124s 00:22:01.411 sys 0m3.987s 00:22:01.411 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:01.411 17:15:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.411 ************************************ 00:22:01.411 END TEST raid_superblock_test 00:22:01.411 ************************************ 00:22:01.411 17:15:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:01.411 17:15:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:22:01.411 17:15:56 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:01.411 17:15:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:01.411 17:15:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:01.411 ************************************ 00:22:01.411 START TEST raid_read_error_test 00:22:01.411 ************************************ 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:01.411 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.XoEcvQTlYc 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4169536 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4169536 /var/tmp/spdk-raid.sock 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4169536 ']' 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:01.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:01.412 17:15:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.412 [2024-07-23 17:15:56.811821] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:22:01.412 [2024-07-23 17:15:56.811883] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4169536 ] 00:22:01.671 [2024-07-23 17:15:56.933837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:01.671 [2024-07-23 17:15:56.989133] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:01.671 [2024-07-23 17:15:57.051867] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:01.671 [2024-07-23 17:15:57.051904] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:02.607 17:15:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:02.607 17:15:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:02.607 17:15:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:02.607 17:15:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:02.607 BaseBdev1_malloc 00:22:02.607 17:15:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:02.865 true 00:22:02.865 17:15:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:03.124 [2024-07-23 17:15:58.425514] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:03.124 [2024-07-23 17:15:58.425560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.124 [2024-07-23 17:15:58.425579] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184e5c0 00:22:03.124 [2024-07-23 17:15:58.425591] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.124 [2024-07-23 17:15:58.427071] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.124 [2024-07-23 17:15:58.427098] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:03.124 BaseBdev1 00:22:03.124 17:15:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:03.124 17:15:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:03.418 BaseBdev2_malloc 00:22:03.418 17:15:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:03.680 true 00:22:03.680 17:15:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:03.680 [2024-07-23 17:15:59.087833] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:03.680 [2024-07-23 17:15:59.087876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.680 [2024-07-23 17:15:59.087903] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1848620 00:22:03.680 [2024-07-23 17:15:59.087917] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.680 [2024-07-23 17:15:59.089292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.680 [2024-07-23 17:15:59.089318] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:03.680 BaseBdev2 00:22:03.938 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:03.938 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:03.938 BaseBdev3_malloc 00:22:03.938 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:04.196 true 00:22:04.196 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:04.455 [2024-07-23 17:15:59.766062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:04.455 [2024-07-23 17:15:59.766107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:04.455 [2024-07-23 17:15:59.766126] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1848c00 00:22:04.455 [2024-07-23 17:15:59.766139] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.455 [2024-07-23 17:15:59.767532] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.455 [2024-07-23 17:15:59.767560] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:04.455 BaseBdev3 00:22:04.455 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:22:04.714 [2024-07-23 17:15:59.938559] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:04.714 [2024-07-23 17:15:59.939835] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:04.714 [2024-07-23 17:15:59.939904] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:04.714 [2024-07-23 17:15:59.940098] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x184a670 00:22:04.714 [2024-07-23 17:15:59.940109] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:04.714 [2024-07-23 17:15:59.940278] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1736a70 00:22:04.714 [2024-07-23 17:15:59.940425] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x184a670 00:22:04.714 [2024-07-23 17:15:59.940435] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x184a670 00:22:04.714 [2024-07-23 17:15:59.940531] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.714 17:15:59 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.714 17:15:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.972 17:16:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.972 "name": "raid_bdev1", 00:22:04.972 "uuid": "d5f9e899-36fe-4aca-b53c-c2287b9d6cf9", 00:22:04.972 "strip_size_kb": 0, 00:22:04.972 "state": "online", 00:22:04.972 "raid_level": "raid1", 00:22:04.972 "superblock": true, 00:22:04.972 "num_base_bdevs": 3, 00:22:04.972 "num_base_bdevs_discovered": 3, 00:22:04.972 "num_base_bdevs_operational": 3, 00:22:04.972 "base_bdevs_list": [ 00:22:04.972 { 00:22:04.972 "name": "BaseBdev1", 00:22:04.972 "uuid": "74de8f1c-d8a9-50b7-84ed-8ab9422b6ba2", 00:22:04.972 "is_configured": true, 00:22:04.972 "data_offset": 2048, 00:22:04.972 "data_size": 63488 00:22:04.972 }, 00:22:04.972 { 00:22:04.972 "name": "BaseBdev2", 00:22:04.972 "uuid": "f1b9ec3f-ee80-59e8-9d04-f6a831ff62c9", 00:22:04.972 
"is_configured": true, 00:22:04.972 "data_offset": 2048, 00:22:04.972 "data_size": 63488 00:22:04.972 }, 00:22:04.972 { 00:22:04.972 "name": "BaseBdev3", 00:22:04.972 "uuid": "bb45db85-8aba-5b38-bee7-358f9a998d16", 00:22:04.972 "is_configured": true, 00:22:04.972 "data_offset": 2048, 00:22:04.972 "data_size": 63488 00:22:04.972 } 00:22:04.972 ] 00:22:04.972 }' 00:22:04.972 17:16:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.972 17:16:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.540 17:16:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:05.540 17:16:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:05.540 [2024-07-23 17:16:00.921458] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x184bf40 00:22:06.476 17:16:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.735 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.994 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.994 "name": "raid_bdev1", 00:22:06.994 "uuid": "d5f9e899-36fe-4aca-b53c-c2287b9d6cf9", 00:22:06.994 "strip_size_kb": 0, 00:22:06.994 "state": "online", 00:22:06.994 "raid_level": "raid1", 00:22:06.994 "superblock": true, 00:22:06.994 "num_base_bdevs": 3, 00:22:06.994 "num_base_bdevs_discovered": 3, 00:22:06.994 "num_base_bdevs_operational": 3, 00:22:06.994 "base_bdevs_list": [ 00:22:06.994 { 00:22:06.994 "name": "BaseBdev1", 00:22:06.994 "uuid": "74de8f1c-d8a9-50b7-84ed-8ab9422b6ba2", 00:22:06.994 "is_configured": true, 00:22:06.994 "data_offset": 2048, 00:22:06.994 "data_size": 63488 00:22:06.994 }, 00:22:06.994 { 00:22:06.994 "name": "BaseBdev2", 00:22:06.994 "uuid": "f1b9ec3f-ee80-59e8-9d04-f6a831ff62c9", 00:22:06.994 "is_configured": true, 00:22:06.994 "data_offset": 2048, 00:22:06.994 
"data_size": 63488 00:22:06.994 }, 00:22:06.994 { 00:22:06.994 "name": "BaseBdev3", 00:22:06.994 "uuid": "bb45db85-8aba-5b38-bee7-358f9a998d16", 00:22:06.994 "is_configured": true, 00:22:06.994 "data_offset": 2048, 00:22:06.994 "data_size": 63488 00:22:06.994 } 00:22:06.994 ] 00:22:06.994 }' 00:22:06.994 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.994 17:16:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.562 17:16:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:07.821 [2024-07-23 17:16:03.113965] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:07.821 [2024-07-23 17:16:03.114011] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:07.821 [2024-07-23 17:16:03.117145] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:07.821 [2024-07-23 17:16:03.117184] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.821 [2024-07-23 17:16:03.117283] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:07.821 [2024-07-23 17:16:03.117294] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x184a670 name raid_bdev1, state offline 00:22:07.821 0 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4169536 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4169536 ']' 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4169536 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' 
Linux = Linux ']' 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4169536 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4169536' 00:22:07.821 killing process with pid 4169536 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4169536 00:22:07.821 [2024-07-23 17:16:03.200068] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:07.821 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4169536 00:22:07.821 [2024-07-23 17:16:03.222075] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.XoEcvQTlYc 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:08.090 00:22:08.090 real 0m6.706s 00:22:08.090 user 0m10.563s 00:22:08.090 sys 0m1.216s 00:22:08.090 17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:08.090 
17:16:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.090 ************************************ 00:22:08.090 END TEST raid_read_error_test 00:22:08.090 ************************************ 00:22:08.090 17:16:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:08.090 17:16:03 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:22:08.090 17:16:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:08.090 17:16:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:08.090 17:16:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:08.354 ************************************ 00:22:08.354 START TEST raid_write_error_test 00:22:08.354 ************************************ 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:08.354 
17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:08.354 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.U2b1kgkrFO 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4170507 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4170507 /var/tmp/spdk-raid.sock 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4170507 ']' 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:08.355 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:08.355 17:16:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.355 [2024-07-23 17:16:03.618489] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:22:08.355 [2024-07-23 17:16:03.618565] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4170507 ] 00:22:08.355 [2024-07-23 17:16:03.752442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:08.614 [2024-07-23 17:16:03.807526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:08.614 [2024-07-23 17:16:03.866765] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:08.614 [2024-07-23 17:16:03.866791] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:09.179 17:16:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:09.179 17:16:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:09.179 17:16:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:09.179 17:16:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:09.437 BaseBdev1_malloc 00:22:09.437 17:16:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:09.696 true 00:22:09.696 17:16:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:10.262 [2024-07-23 17:16:05.478477] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:10.262 [2024-07-23 17:16:05.478529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:10.262 [2024-07-23 17:16:05.478550] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27e25c0 00:22:10.262 [2024-07-23 17:16:05.478562] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.262 [2024-07-23 17:16:05.480297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.262 [2024-07-23 17:16:05.480328] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:10.262 BaseBdev1 00:22:10.262 17:16:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:10.262 17:16:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:10.520 BaseBdev2_malloc 00:22:10.520 17:16:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:10.779 true 00:22:10.779 17:16:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:11.038 [2024-07-23 17:16:06.285243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:11.038 [2024-07-23 17:16:06.285289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.038 [2024-07-23 17:16:06.285309] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27dc620 00:22:11.038 [2024-07-23 17:16:06.285322] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.038 [2024-07-23 17:16:06.286743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.038 [2024-07-23 17:16:06.286773] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:11.038 BaseBdev2 00:22:11.038 17:16:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:11.038 17:16:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:11.297 BaseBdev3_malloc 00:22:11.297 17:16:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:11.555 true 00:22:11.555 17:16:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:11.814 [2024-07-23 17:16:07.031710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:11.814 [2024-07-23 17:16:07.031751] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.814 [2024-07-23 17:16:07.031771] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27dcc00 00:22:11.814 [2024-07-23 17:16:07.031783] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.814 [2024-07-23 17:16:07.033185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.814 [2024-07-23 17:16:07.033211] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:11.814 BaseBdev3 00:22:11.814 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:22:12.073 [2024-07-23 17:16:07.280394] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:12.073 [2024-07-23 17:16:07.281646] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:12.073 [2024-07-23 17:16:07.281708] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:12.073 [2024-07-23 17:16:07.281906] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27de670 00:22:12.073 [2024-07-23 17:16:07.281918] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:12.073 [2024-07-23 17:16:07.282084] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26caa70 00:22:12.073 [2024-07-23 17:16:07.282229] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27de670 00:22:12.073 [2024-07-23 17:16:07.282240] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27de670 00:22:12.073 [2024-07-23 17:16:07.282334] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.073 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.332 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.332 "name": "raid_bdev1", 00:22:12.332 "uuid": "2fd1a5c0-4a71-433b-9171-ea34ae8b9574", 00:22:12.332 "strip_size_kb": 0, 00:22:12.332 "state": "online", 00:22:12.332 "raid_level": "raid1", 00:22:12.332 "superblock": true, 00:22:12.332 "num_base_bdevs": 3, 00:22:12.332 "num_base_bdevs_discovered": 3, 00:22:12.332 "num_base_bdevs_operational": 3, 00:22:12.332 "base_bdevs_list": [ 00:22:12.332 { 00:22:12.332 "name": "BaseBdev1", 00:22:12.332 "uuid": "b4d00989-7972-5caa-80f6-1ab7721b6ded", 00:22:12.332 "is_configured": true, 00:22:12.332 "data_offset": 2048, 00:22:12.332 "data_size": 63488 00:22:12.332 }, 00:22:12.332 { 00:22:12.332 "name": "BaseBdev2", 00:22:12.332 "uuid": "74a35311-31ae-575e-bec7-0b62cc294a02", 00:22:12.332 "is_configured": true, 00:22:12.332 "data_offset": 2048, 00:22:12.332 "data_size": 63488 00:22:12.332 }, 00:22:12.332 { 00:22:12.332 "name": "BaseBdev3", 00:22:12.332 "uuid": "2b119cc7-4cdc-527b-b4bf-7cad7fbdedf0", 00:22:12.332 "is_configured": true, 00:22:12.332 "data_offset": 2048, 00:22:12.332 "data_size": 63488 00:22:12.332 } 00:22:12.332 ] 00:22:12.332 }' 00:22:12.332 17:16:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.332 17:16:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.901 17:16:08 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:22:12.901 17:16:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:12.901 [2024-07-23 17:16:08.259265] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27dff40 00:22:13.838 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:14.097 [2024-07-23 17:16:09.379621] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:14.097 [2024-07-23 17:16:09.379690] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:14.098 [2024-07-23 17:16:09.379890] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x27dff40 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.098 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.358 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.358 "name": "raid_bdev1", 00:22:14.358 "uuid": "2fd1a5c0-4a71-433b-9171-ea34ae8b9574", 00:22:14.358 "strip_size_kb": 0, 00:22:14.358 "state": "online", 00:22:14.358 "raid_level": "raid1", 00:22:14.358 "superblock": true, 00:22:14.358 "num_base_bdevs": 3, 00:22:14.358 "num_base_bdevs_discovered": 2, 00:22:14.358 "num_base_bdevs_operational": 2, 00:22:14.358 "base_bdevs_list": [ 00:22:14.358 { 00:22:14.358 "name": null, 00:22:14.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:14.358 "is_configured": false, 00:22:14.358 "data_offset": 2048, 00:22:14.358 "data_size": 63488 00:22:14.358 }, 00:22:14.358 { 00:22:14.358 "name": "BaseBdev2", 00:22:14.358 "uuid": "74a35311-31ae-575e-bec7-0b62cc294a02", 00:22:14.358 "is_configured": true, 00:22:14.358 "data_offset": 2048, 00:22:14.358 "data_size": 63488 00:22:14.358 }, 00:22:14.358 { 00:22:14.358 "name": "BaseBdev3", 00:22:14.358 "uuid": "2b119cc7-4cdc-527b-b4bf-7cad7fbdedf0", 00:22:14.358 "is_configured": true, 00:22:14.358 "data_offset": 2048, 
00:22:14.358 "data_size": 63488 00:22:14.358 } 00:22:14.358 ] 00:22:14.358 }' 00:22:14.358 17:16:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.358 17:16:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.927 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:15.187 [2024-07-23 17:16:10.362510] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:15.187 [2024-07-23 17:16:10.362541] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:15.187 [2024-07-23 17:16:10.365710] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:15.187 [2024-07-23 17:16:10.365742] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:15.187 [2024-07-23 17:16:10.365814] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:15.187 [2024-07-23 17:16:10.365826] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27de670 name raid_bdev1, state offline 00:22:15.187 0 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4170507 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4170507 ']' 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4170507 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4170507 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4170507' 00:22:15.187 killing process with pid 4170507 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4170507 00:22:15.187 [2024-07-23 17:16:10.444923] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:15.187 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4170507 00:22:15.187 [2024-07-23 17:16:10.465619] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.U2b1kgkrFO 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:15.447 00:22:15.447 real 0m7.144s 00:22:15.447 user 0m11.372s 00:22:15.447 sys 0m1.263s 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:15.447 17:16:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:15.447 ************************************ 00:22:15.447 END TEST raid_write_error_test 
00:22:15.447 ************************************ 00:22:15.447 17:16:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:15.447 17:16:10 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:22:15.447 17:16:10 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:22:15.447 17:16:10 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:22:15.447 17:16:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:15.447 17:16:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:15.447 17:16:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:15.447 ************************************ 00:22:15.447 START TEST raid_state_function_test 00:22:15.447 ************************************ 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # 
'[' false = true ']' 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4171648 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4171648' 00:22:15.447 Process raid pid: 4171648 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4171648 /var/tmp/spdk-raid.sock 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4171648 ']' 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:15.447 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:15.447 17:16:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:15.447 [2024-07-23 17:16:10.848163] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:22:15.447 [2024-07-23 17:16:10.848238] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:15.705 [2024-07-23 17:16:10.984050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:15.705 [2024-07-23 17:16:11.039564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:15.705 [2024-07-23 17:16:11.103059] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:15.705 [2024-07-23 17:16:11.103091] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:16.641 [2024-07-23 17:16:11.939033] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:16.641 [2024-07-23 17:16:11.939073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:16.641 [2024-07-23 17:16:11.939084] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:16.641 [2024-07-23 17:16:11.939096] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:16.641 [2024-07-23 17:16:11.939105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:16.641 [2024-07-23 17:16:11.939116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:22:16.641 [2024-07-23 17:16:11.939124] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:16.641 [2024-07-23 17:16:11.939135] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.641 17:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:16.900 17:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.900 "name": "Existed_Raid", 00:22:16.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.901 "strip_size_kb": 64, 
00:22:16.901 "state": "configuring", 00:22:16.901 "raid_level": "raid0", 00:22:16.901 "superblock": false, 00:22:16.901 "num_base_bdevs": 4, 00:22:16.901 "num_base_bdevs_discovered": 0, 00:22:16.901 "num_base_bdevs_operational": 4, 00:22:16.901 "base_bdevs_list": [ 00:22:16.901 { 00:22:16.901 "name": "BaseBdev1", 00:22:16.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.901 "is_configured": false, 00:22:16.901 "data_offset": 0, 00:22:16.901 "data_size": 0 00:22:16.901 }, 00:22:16.901 { 00:22:16.901 "name": "BaseBdev2", 00:22:16.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.901 "is_configured": false, 00:22:16.901 "data_offset": 0, 00:22:16.901 "data_size": 0 00:22:16.901 }, 00:22:16.901 { 00:22:16.901 "name": "BaseBdev3", 00:22:16.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.901 "is_configured": false, 00:22:16.901 "data_offset": 0, 00:22:16.901 "data_size": 0 00:22:16.901 }, 00:22:16.901 { 00:22:16.901 "name": "BaseBdev4", 00:22:16.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.901 "is_configured": false, 00:22:16.901 "data_offset": 0, 00:22:16.901 "data_size": 0 00:22:16.901 } 00:22:16.901 ] 00:22:16.901 }' 00:22:16.901 17:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.901 17:16:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:17.468 17:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:17.727 [2024-07-23 17:16:12.977707] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:17.727 [2024-07-23 17:16:12.977737] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23ec430 name Existed_Raid, state configuring 00:22:17.727 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:17.986 [2024-07-23 17:16:13.226388] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:17.986 [2024-07-23 17:16:13.226415] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:17.986 [2024-07-23 17:16:13.226425] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:17.986 [2024-07-23 17:16:13.226436] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:17.986 [2024-07-23 17:16:13.226445] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:17.986 [2024-07-23 17:16:13.226456] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:17.986 [2024-07-23 17:16:13.226465] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:17.986 [2024-07-23 17:16:13.226475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:17.986 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:18.245 [2024-07-23 17:16:13.488965] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:18.245 BaseBdev1 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:18.245 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:18.504 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:18.763 [ 00:22:18.763 { 00:22:18.763 "name": "BaseBdev1", 00:22:18.763 "aliases": [ 00:22:18.763 "21a74dcb-d85e-43f9-9465-f1c6652e9987" 00:22:18.763 ], 00:22:18.763 "product_name": "Malloc disk", 00:22:18.763 "block_size": 512, 00:22:18.763 "num_blocks": 65536, 00:22:18.763 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:18.763 "assigned_rate_limits": { 00:22:18.763 "rw_ios_per_sec": 0, 00:22:18.763 "rw_mbytes_per_sec": 0, 00:22:18.763 "r_mbytes_per_sec": 0, 00:22:18.763 "w_mbytes_per_sec": 0 00:22:18.763 }, 00:22:18.763 "claimed": true, 00:22:18.763 "claim_type": "exclusive_write", 00:22:18.763 "zoned": false, 00:22:18.763 "supported_io_types": { 00:22:18.763 "read": true, 00:22:18.763 "write": true, 00:22:18.764 "unmap": true, 00:22:18.764 "flush": true, 00:22:18.764 "reset": true, 00:22:18.764 "nvme_admin": false, 00:22:18.764 "nvme_io": false, 00:22:18.764 "nvme_io_md": false, 00:22:18.764 "write_zeroes": true, 00:22:18.764 "zcopy": true, 00:22:18.764 "get_zone_info": false, 00:22:18.764 "zone_management": false, 00:22:18.764 "zone_append": false, 00:22:18.764 "compare": false, 00:22:18.764 "compare_and_write": false, 00:22:18.764 "abort": true, 00:22:18.764 "seek_hole": false, 00:22:18.764 "seek_data": false, 00:22:18.764 "copy": true, 00:22:18.764 "nvme_iov_md": false 
00:22:18.764 }, 00:22:18.764 "memory_domains": [ 00:22:18.764 { 00:22:18.764 "dma_device_id": "system", 00:22:18.764 "dma_device_type": 1 00:22:18.764 }, 00:22:18.764 { 00:22:18.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:18.764 "dma_device_type": 2 00:22:18.764 } 00:22:18.764 ], 00:22:18.764 "driver_specific": {} 00:22:18.764 } 00:22:18.764 ] 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.764 17:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.764 17:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.764 17:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.764 17:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.764 17:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.023 17:16:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.023 "name": "Existed_Raid", 00:22:19.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.023 "strip_size_kb": 64, 00:22:19.023 "state": "configuring", 00:22:19.023 "raid_level": "raid0", 00:22:19.023 "superblock": false, 00:22:19.023 "num_base_bdevs": 4, 00:22:19.023 "num_base_bdevs_discovered": 1, 00:22:19.023 "num_base_bdevs_operational": 4, 00:22:19.023 "base_bdevs_list": [ 00:22:19.023 { 00:22:19.023 "name": "BaseBdev1", 00:22:19.023 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:19.023 "is_configured": true, 00:22:19.023 "data_offset": 0, 00:22:19.023 "data_size": 65536 00:22:19.023 }, 00:22:19.023 { 00:22:19.023 "name": "BaseBdev2", 00:22:19.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.023 "is_configured": false, 00:22:19.023 "data_offset": 0, 00:22:19.023 "data_size": 0 00:22:19.023 }, 00:22:19.023 { 00:22:19.023 "name": "BaseBdev3", 00:22:19.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.023 "is_configured": false, 00:22:19.023 "data_offset": 0, 00:22:19.023 "data_size": 0 00:22:19.023 }, 00:22:19.023 { 00:22:19.023 "name": "BaseBdev4", 00:22:19.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.023 "is_configured": false, 00:22:19.023 "data_offset": 0, 00:22:19.023 "data_size": 0 00:22:19.023 } 00:22:19.023 ] 00:22:19.023 }' 00:22:19.023 17:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.023 17:16:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.591 17:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:19.591 [2024-07-23 17:16:14.988936] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:19.591 [2024-07-23 17:16:14.988974] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23ebd60 name Existed_Raid, state configuring 00:22:19.591 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:19.850 [2024-07-23 17:16:15.237616] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:19.850 [2024-07-23 17:16:15.239051] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:19.850 [2024-07-23 17:16:15.239096] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:19.850 [2024-07-23 17:16:15.239107] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:19.850 [2024-07-23 17:16:15.239119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:19.850 [2024-07-23 17:16:15.239128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:19.850 [2024-07-23 17:16:15.239139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:19.850 
17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.850 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:20.109 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.109 "name": "Existed_Raid", 00:22:20.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.109 "strip_size_kb": 64, 00:22:20.109 "state": "configuring", 00:22:20.109 "raid_level": "raid0", 00:22:20.109 "superblock": false, 00:22:20.109 "num_base_bdevs": 4, 00:22:20.109 "num_base_bdevs_discovered": 1, 00:22:20.109 "num_base_bdevs_operational": 4, 00:22:20.109 "base_bdevs_list": [ 00:22:20.109 { 00:22:20.109 "name": "BaseBdev1", 00:22:20.109 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:20.109 "is_configured": true, 00:22:20.109 "data_offset": 0, 00:22:20.109 "data_size": 65536 00:22:20.109 }, 00:22:20.109 { 00:22:20.109 "name": "BaseBdev2", 00:22:20.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.109 "is_configured": false, 00:22:20.109 "data_offset": 0, 00:22:20.109 "data_size": 0 00:22:20.109 }, 00:22:20.109 { 00:22:20.109 "name": "BaseBdev3", 00:22:20.109 
"uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.109 "is_configured": false, 00:22:20.109 "data_offset": 0, 00:22:20.109 "data_size": 0 00:22:20.109 }, 00:22:20.109 { 00:22:20.109 "name": "BaseBdev4", 00:22:20.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.109 "is_configured": false, 00:22:20.109 "data_offset": 0, 00:22:20.109 "data_size": 0 00:22:20.109 } 00:22:20.109 ] 00:22:20.109 }' 00:22:20.109 17:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.109 17:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:21.115 17:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:21.373 [2024-07-23 17:16:16.688817] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:21.373 BaseBdev2 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:21.373 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:21.632 17:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:21.891 [ 00:22:21.891 { 00:22:21.891 "name": "BaseBdev2", 00:22:21.891 "aliases": [ 00:22:21.891 "760e0812-fb99-449c-9b1e-d93ea123c399" 00:22:21.891 ], 00:22:21.891 "product_name": "Malloc disk", 00:22:21.891 "block_size": 512, 00:22:21.891 "num_blocks": 65536, 00:22:21.891 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:21.891 "assigned_rate_limits": { 00:22:21.891 "rw_ios_per_sec": 0, 00:22:21.891 "rw_mbytes_per_sec": 0, 00:22:21.891 "r_mbytes_per_sec": 0, 00:22:21.891 "w_mbytes_per_sec": 0 00:22:21.891 }, 00:22:21.891 "claimed": true, 00:22:21.891 "claim_type": "exclusive_write", 00:22:21.891 "zoned": false, 00:22:21.891 "supported_io_types": { 00:22:21.891 "read": true, 00:22:21.891 "write": true, 00:22:21.891 "unmap": true, 00:22:21.891 "flush": true, 00:22:21.891 "reset": true, 00:22:21.891 "nvme_admin": false, 00:22:21.891 "nvme_io": false, 00:22:21.891 "nvme_io_md": false, 00:22:21.891 "write_zeroes": true, 00:22:21.891 "zcopy": true, 00:22:21.891 "get_zone_info": false, 00:22:21.891 "zone_management": false, 00:22:21.891 "zone_append": false, 00:22:21.891 "compare": false, 00:22:21.891 "compare_and_write": false, 00:22:21.891 "abort": true, 00:22:21.891 "seek_hole": false, 00:22:21.891 "seek_data": false, 00:22:21.891 "copy": true, 00:22:21.891 "nvme_iov_md": false 00:22:21.891 }, 00:22:21.891 "memory_domains": [ 00:22:21.891 { 00:22:21.891 "dma_device_id": "system", 00:22:21.891 "dma_device_type": 1 00:22:21.891 }, 00:22:21.891 { 00:22:21.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:21.891 "dma_device_type": 2 00:22:21.891 } 00:22:21.891 ], 00:22:21.891 "driver_specific": {} 00:22:21.891 } 00:22:21.891 ] 00:22:21.891 17:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.892 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:22.150 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.150 "name": "Existed_Raid", 00:22:22.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.150 "strip_size_kb": 64, 00:22:22.150 "state": "configuring", 00:22:22.150 "raid_level": "raid0", 00:22:22.150 "superblock": false, 00:22:22.150 "num_base_bdevs": 4, 00:22:22.150 
"num_base_bdevs_discovered": 2, 00:22:22.150 "num_base_bdevs_operational": 4, 00:22:22.150 "base_bdevs_list": [ 00:22:22.150 { 00:22:22.150 "name": "BaseBdev1", 00:22:22.151 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:22.151 "is_configured": true, 00:22:22.151 "data_offset": 0, 00:22:22.151 "data_size": 65536 00:22:22.151 }, 00:22:22.151 { 00:22:22.151 "name": "BaseBdev2", 00:22:22.151 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:22.151 "is_configured": true, 00:22:22.151 "data_offset": 0, 00:22:22.151 "data_size": 65536 00:22:22.151 }, 00:22:22.151 { 00:22:22.151 "name": "BaseBdev3", 00:22:22.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.151 "is_configured": false, 00:22:22.151 "data_offset": 0, 00:22:22.151 "data_size": 0 00:22:22.151 }, 00:22:22.151 { 00:22:22.151 "name": "BaseBdev4", 00:22:22.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.151 "is_configured": false, 00:22:22.151 "data_offset": 0, 00:22:22.151 "data_size": 0 00:22:22.151 } 00:22:22.151 ] 00:22:22.151 }' 00:22:22.151 17:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.151 17:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.718 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:22.976 [2024-07-23 17:16:18.332535] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:22.976 BaseBdev3 00:22:22.976 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:22.976 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:22.976 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:22.976 17:16:18 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:22.976 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:22.976 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:22.976 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:23.235 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:23.493 [ 00:22:23.493 { 00:22:23.493 "name": "BaseBdev3", 00:22:23.493 "aliases": [ 00:22:23.493 "1190096e-8dfe-4c49-a9d4-16cf4946f563" 00:22:23.493 ], 00:22:23.493 "product_name": "Malloc disk", 00:22:23.493 "block_size": 512, 00:22:23.493 "num_blocks": 65536, 00:22:23.493 "uuid": "1190096e-8dfe-4c49-a9d4-16cf4946f563", 00:22:23.493 "assigned_rate_limits": { 00:22:23.493 "rw_ios_per_sec": 0, 00:22:23.493 "rw_mbytes_per_sec": 0, 00:22:23.493 "r_mbytes_per_sec": 0, 00:22:23.493 "w_mbytes_per_sec": 0 00:22:23.493 }, 00:22:23.493 "claimed": true, 00:22:23.493 "claim_type": "exclusive_write", 00:22:23.493 "zoned": false, 00:22:23.493 "supported_io_types": { 00:22:23.493 "read": true, 00:22:23.493 "write": true, 00:22:23.493 "unmap": true, 00:22:23.493 "flush": true, 00:22:23.493 "reset": true, 00:22:23.493 "nvme_admin": false, 00:22:23.493 "nvme_io": false, 00:22:23.493 "nvme_io_md": false, 00:22:23.493 "write_zeroes": true, 00:22:23.493 "zcopy": true, 00:22:23.493 "get_zone_info": false, 00:22:23.493 "zone_management": false, 00:22:23.493 "zone_append": false, 00:22:23.493 "compare": false, 00:22:23.493 "compare_and_write": false, 00:22:23.493 "abort": true, 00:22:23.493 "seek_hole": false, 00:22:23.493 "seek_data": false, 00:22:23.493 "copy": 
true, 00:22:23.493 "nvme_iov_md": false 00:22:23.493 }, 00:22:23.493 "memory_domains": [ 00:22:23.493 { 00:22:23.493 "dma_device_id": "system", 00:22:23.493 "dma_device_type": 1 00:22:23.493 }, 00:22:23.493 { 00:22:23.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.493 "dma_device_type": 2 00:22:23.493 } 00:22:23.493 ], 00:22:23.493 "driver_specific": {} 00:22:23.493 } 00:22:23.493 ] 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.493 17:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:23.751 17:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.751 "name": "Existed_Raid", 00:22:23.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.752 "strip_size_kb": 64, 00:22:23.752 "state": "configuring", 00:22:23.752 "raid_level": "raid0", 00:22:23.752 "superblock": false, 00:22:23.752 "num_base_bdevs": 4, 00:22:23.752 "num_base_bdevs_discovered": 3, 00:22:23.752 "num_base_bdevs_operational": 4, 00:22:23.752 "base_bdevs_list": [ 00:22:23.752 { 00:22:23.752 "name": "BaseBdev1", 00:22:23.752 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:23.752 "is_configured": true, 00:22:23.752 "data_offset": 0, 00:22:23.752 "data_size": 65536 00:22:23.752 }, 00:22:23.752 { 00:22:23.752 "name": "BaseBdev2", 00:22:23.752 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:23.752 "is_configured": true, 00:22:23.752 "data_offset": 0, 00:22:23.752 "data_size": 65536 00:22:23.752 }, 00:22:23.752 { 00:22:23.752 "name": "BaseBdev3", 00:22:23.752 "uuid": "1190096e-8dfe-4c49-a9d4-16cf4946f563", 00:22:23.752 "is_configured": true, 00:22:23.752 "data_offset": 0, 00:22:23.752 "data_size": 65536 00:22:23.752 }, 00:22:23.752 { 00:22:23.752 "name": "BaseBdev4", 00:22:23.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.752 "is_configured": false, 00:22:23.752 "data_offset": 0, 00:22:23.752 "data_size": 0 00:22:23.752 } 00:22:23.752 ] 00:22:23.752 }' 00:22:23.752 17:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.752 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.318 17:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:24.577 [2024-07-23 17:16:19.932224] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:24.577 [2024-07-23 17:16:19.932262] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23eb9b0 00:22:24.577 [2024-07-23 17:16:19.932270] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:22:24.577 [2024-07-23 17:16:19.932465] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2499260 00:22:24.577 [2024-07-23 17:16:19.932583] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23eb9b0 00:22:24.577 [2024-07-23 17:16:19.932593] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23eb9b0 00:22:24.577 [2024-07-23 17:16:19.932755] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.577 BaseBdev4 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:24.577 17:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:24.835 17:16:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:25.093 [ 00:22:25.093 { 00:22:25.093 "name": "BaseBdev4", 00:22:25.093 "aliases": [ 00:22:25.093 "9bb6ac31-944f-4a44-b594-3f87eec60fc1" 00:22:25.093 ], 00:22:25.093 "product_name": "Malloc disk", 00:22:25.093 "block_size": 512, 00:22:25.093 "num_blocks": 65536, 00:22:25.093 "uuid": "9bb6ac31-944f-4a44-b594-3f87eec60fc1", 00:22:25.094 "assigned_rate_limits": { 00:22:25.094 "rw_ios_per_sec": 0, 00:22:25.094 "rw_mbytes_per_sec": 0, 00:22:25.094 "r_mbytes_per_sec": 0, 00:22:25.094 "w_mbytes_per_sec": 0 00:22:25.094 }, 00:22:25.094 "claimed": true, 00:22:25.094 "claim_type": "exclusive_write", 00:22:25.094 "zoned": false, 00:22:25.094 "supported_io_types": { 00:22:25.094 "read": true, 00:22:25.094 "write": true, 00:22:25.094 "unmap": true, 00:22:25.094 "flush": true, 00:22:25.094 "reset": true, 00:22:25.094 "nvme_admin": false, 00:22:25.094 "nvme_io": false, 00:22:25.094 "nvme_io_md": false, 00:22:25.094 "write_zeroes": true, 00:22:25.094 "zcopy": true, 00:22:25.094 "get_zone_info": false, 00:22:25.094 "zone_management": false, 00:22:25.094 "zone_append": false, 00:22:25.094 "compare": false, 00:22:25.094 "compare_and_write": false, 00:22:25.094 "abort": true, 00:22:25.094 "seek_hole": false, 00:22:25.094 "seek_data": false, 00:22:25.094 "copy": true, 00:22:25.094 "nvme_iov_md": false 00:22:25.094 }, 00:22:25.094 "memory_domains": [ 00:22:25.094 { 00:22:25.094 "dma_device_id": "system", 00:22:25.094 "dma_device_type": 1 00:22:25.094 }, 00:22:25.094 { 00:22:25.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.094 "dma_device_type": 2 00:22:25.094 } 00:22:25.094 ], 00:22:25.094 "driver_specific": {} 00:22:25.094 } 00:22:25.094 ] 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.094 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:25.352 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:25.352 "name": "Existed_Raid", 00:22:25.352 "uuid": "e50c9269-0239-4a88-b3aa-44859405768c", 00:22:25.352 "strip_size_kb": 64, 00:22:25.352 "state": "online", 00:22:25.352 "raid_level": "raid0", 00:22:25.352 "superblock": false, 00:22:25.352 
"num_base_bdevs": 4, 00:22:25.352 "num_base_bdevs_discovered": 4, 00:22:25.352 "num_base_bdevs_operational": 4, 00:22:25.352 "base_bdevs_list": [ 00:22:25.352 { 00:22:25.352 "name": "BaseBdev1", 00:22:25.352 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:25.352 "is_configured": true, 00:22:25.352 "data_offset": 0, 00:22:25.352 "data_size": 65536 00:22:25.352 }, 00:22:25.352 { 00:22:25.352 "name": "BaseBdev2", 00:22:25.352 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:25.352 "is_configured": true, 00:22:25.352 "data_offset": 0, 00:22:25.352 "data_size": 65536 00:22:25.352 }, 00:22:25.352 { 00:22:25.352 "name": "BaseBdev3", 00:22:25.352 "uuid": "1190096e-8dfe-4c49-a9d4-16cf4946f563", 00:22:25.352 "is_configured": true, 00:22:25.352 "data_offset": 0, 00:22:25.352 "data_size": 65536 00:22:25.352 }, 00:22:25.352 { 00:22:25.352 "name": "BaseBdev4", 00:22:25.352 "uuid": "9bb6ac31-944f-4a44-b594-3f87eec60fc1", 00:22:25.352 "is_configured": true, 00:22:25.352 "data_offset": 0, 00:22:25.352 "data_size": 65536 00:22:25.352 } 00:22:25.352 ] 00:22:25.352 }' 00:22:25.352 17:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:25.352 17:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:26.288 17:16:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:26.288 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:26.546 [2024-07-23 17:16:21.809526] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:26.546 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:26.546 "name": "Existed_Raid", 00:22:26.546 "aliases": [ 00:22:26.546 "e50c9269-0239-4a88-b3aa-44859405768c" 00:22:26.546 ], 00:22:26.546 "product_name": "Raid Volume", 00:22:26.546 "block_size": 512, 00:22:26.546 "num_blocks": 262144, 00:22:26.546 "uuid": "e50c9269-0239-4a88-b3aa-44859405768c", 00:22:26.546 "assigned_rate_limits": { 00:22:26.546 "rw_ios_per_sec": 0, 00:22:26.546 "rw_mbytes_per_sec": 0, 00:22:26.546 "r_mbytes_per_sec": 0, 00:22:26.546 "w_mbytes_per_sec": 0 00:22:26.546 }, 00:22:26.546 "claimed": false, 00:22:26.546 "zoned": false, 00:22:26.546 "supported_io_types": { 00:22:26.546 "read": true, 00:22:26.546 "write": true, 00:22:26.546 "unmap": true, 00:22:26.546 "flush": true, 00:22:26.546 "reset": true, 00:22:26.546 "nvme_admin": false, 00:22:26.546 "nvme_io": false, 00:22:26.546 "nvme_io_md": false, 00:22:26.546 "write_zeroes": true, 00:22:26.546 "zcopy": false, 00:22:26.546 "get_zone_info": false, 00:22:26.546 "zone_management": false, 00:22:26.546 "zone_append": false, 00:22:26.546 "compare": false, 00:22:26.546 "compare_and_write": false, 00:22:26.546 "abort": false, 00:22:26.546 "seek_hole": false, 00:22:26.546 "seek_data": false, 00:22:26.546 "copy": false, 00:22:26.546 "nvme_iov_md": false 00:22:26.546 }, 00:22:26.546 "memory_domains": [ 00:22:26.546 { 00:22:26.546 "dma_device_id": "system", 00:22:26.546 "dma_device_type": 1 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.546 
"dma_device_type": 2 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "system", 00:22:26.546 "dma_device_type": 1 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.546 "dma_device_type": 2 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "system", 00:22:26.546 "dma_device_type": 1 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.546 "dma_device_type": 2 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "system", 00:22:26.546 "dma_device_type": 1 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.546 "dma_device_type": 2 00:22:26.546 } 00:22:26.546 ], 00:22:26.546 "driver_specific": { 00:22:26.546 "raid": { 00:22:26.546 "uuid": "e50c9269-0239-4a88-b3aa-44859405768c", 00:22:26.546 "strip_size_kb": 64, 00:22:26.546 "state": "online", 00:22:26.546 "raid_level": "raid0", 00:22:26.546 "superblock": false, 00:22:26.546 "num_base_bdevs": 4, 00:22:26.546 "num_base_bdevs_discovered": 4, 00:22:26.546 "num_base_bdevs_operational": 4, 00:22:26.546 "base_bdevs_list": [ 00:22:26.546 { 00:22:26.546 "name": "BaseBdev1", 00:22:26.546 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:26.546 "is_configured": true, 00:22:26.546 "data_offset": 0, 00:22:26.546 "data_size": 65536 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "name": "BaseBdev2", 00:22:26.546 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:26.546 "is_configured": true, 00:22:26.546 "data_offset": 0, 00:22:26.546 "data_size": 65536 00:22:26.546 }, 00:22:26.546 { 00:22:26.546 "name": "BaseBdev3", 00:22:26.546 "uuid": "1190096e-8dfe-4c49-a9d4-16cf4946f563", 00:22:26.546 "is_configured": true, 00:22:26.546 "data_offset": 0, 00:22:26.546 "data_size": 65536 00:22:26.547 }, 00:22:26.547 { 00:22:26.547 "name": "BaseBdev4", 00:22:26.547 "uuid": "9bb6ac31-944f-4a44-b594-3f87eec60fc1", 00:22:26.547 "is_configured": true, 00:22:26.547 "data_offset": 0, 
00:22:26.547 "data_size": 65536 00:22:26.547 } 00:22:26.547 ] 00:22:26.547 } 00:22:26.547 } 00:22:26.547 }' 00:22:26.547 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:26.547 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:26.547 BaseBdev2 00:22:26.547 BaseBdev3 00:22:26.547 BaseBdev4' 00:22:26.547 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.547 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:26.547 17:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:26.805 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:26.805 "name": "BaseBdev1", 00:22:26.805 "aliases": [ 00:22:26.805 "21a74dcb-d85e-43f9-9465-f1c6652e9987" 00:22:26.805 ], 00:22:26.805 "product_name": "Malloc disk", 00:22:26.805 "block_size": 512, 00:22:26.805 "num_blocks": 65536, 00:22:26.805 "uuid": "21a74dcb-d85e-43f9-9465-f1c6652e9987", 00:22:26.805 "assigned_rate_limits": { 00:22:26.805 "rw_ios_per_sec": 0, 00:22:26.805 "rw_mbytes_per_sec": 0, 00:22:26.805 "r_mbytes_per_sec": 0, 00:22:26.805 "w_mbytes_per_sec": 0 00:22:26.805 }, 00:22:26.805 "claimed": true, 00:22:26.805 "claim_type": "exclusive_write", 00:22:26.805 "zoned": false, 00:22:26.805 "supported_io_types": { 00:22:26.805 "read": true, 00:22:26.805 "write": true, 00:22:26.805 "unmap": true, 00:22:26.805 "flush": true, 00:22:26.805 "reset": true, 00:22:26.805 "nvme_admin": false, 00:22:26.805 "nvme_io": false, 00:22:26.805 "nvme_io_md": false, 00:22:26.805 "write_zeroes": true, 00:22:26.805 "zcopy": true, 00:22:26.805 "get_zone_info": false, 00:22:26.805 "zone_management": 
false, 00:22:26.805 "zone_append": false, 00:22:26.805 "compare": false, 00:22:26.805 "compare_and_write": false, 00:22:26.805 "abort": true, 00:22:26.805 "seek_hole": false, 00:22:26.805 "seek_data": false, 00:22:26.805 "copy": true, 00:22:26.805 "nvme_iov_md": false 00:22:26.805 }, 00:22:26.805 "memory_domains": [ 00:22:26.805 { 00:22:26.805 "dma_device_id": "system", 00:22:26.805 "dma_device_type": 1 00:22:26.805 }, 00:22:26.805 { 00:22:26.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.805 "dma_device_type": 2 00:22:26.805 } 00:22:26.805 ], 00:22:26.805 "driver_specific": {} 00:22:26.805 }' 00:22:26.805 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.805 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.805 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:26.805 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:27.064 17:16:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:27.064 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:27.322 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:27.322 "name": "BaseBdev2", 00:22:27.322 "aliases": [ 00:22:27.322 "760e0812-fb99-449c-9b1e-d93ea123c399" 00:22:27.322 ], 00:22:27.322 "product_name": "Malloc disk", 00:22:27.322 "block_size": 512, 00:22:27.322 "num_blocks": 65536, 00:22:27.322 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:27.322 "assigned_rate_limits": { 00:22:27.322 "rw_ios_per_sec": 0, 00:22:27.322 "rw_mbytes_per_sec": 0, 00:22:27.322 "r_mbytes_per_sec": 0, 00:22:27.322 "w_mbytes_per_sec": 0 00:22:27.322 }, 00:22:27.322 "claimed": true, 00:22:27.322 "claim_type": "exclusive_write", 00:22:27.322 "zoned": false, 00:22:27.322 "supported_io_types": { 00:22:27.322 "read": true, 00:22:27.322 "write": true, 00:22:27.322 "unmap": true, 00:22:27.322 "flush": true, 00:22:27.322 "reset": true, 00:22:27.322 "nvme_admin": false, 00:22:27.322 "nvme_io": false, 00:22:27.322 "nvme_io_md": false, 00:22:27.322 "write_zeroes": true, 00:22:27.322 "zcopy": true, 00:22:27.322 "get_zone_info": false, 00:22:27.323 "zone_management": false, 00:22:27.323 "zone_append": false, 00:22:27.323 "compare": false, 00:22:27.323 "compare_and_write": false, 00:22:27.323 "abort": true, 00:22:27.323 "seek_hole": false, 00:22:27.323 "seek_data": false, 00:22:27.323 "copy": true, 00:22:27.323 "nvme_iov_md": false 00:22:27.323 }, 00:22:27.323 "memory_domains": [ 00:22:27.323 { 00:22:27.323 "dma_device_id": "system", 00:22:27.323 "dma_device_type": 1 00:22:27.323 }, 00:22:27.323 { 00:22:27.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:27.323 "dma_device_type": 2 00:22:27.323 } 00:22:27.323 ], 00:22:27.323 "driver_specific": {} 00:22:27.323 
}' 00:22:27.323 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.581 17:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:27.840 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:28.101 "name": "BaseBdev3", 00:22:28.101 "aliases": [ 00:22:28.101 "1190096e-8dfe-4c49-a9d4-16cf4946f563" 00:22:28.101 ], 00:22:28.101 "product_name": "Malloc disk", 00:22:28.101 "block_size": 512, 00:22:28.101 "num_blocks": 65536, 
00:22:28.101 "uuid": "1190096e-8dfe-4c49-a9d4-16cf4946f563", 00:22:28.101 "assigned_rate_limits": { 00:22:28.101 "rw_ios_per_sec": 0, 00:22:28.101 "rw_mbytes_per_sec": 0, 00:22:28.101 "r_mbytes_per_sec": 0, 00:22:28.101 "w_mbytes_per_sec": 0 00:22:28.101 }, 00:22:28.101 "claimed": true, 00:22:28.101 "claim_type": "exclusive_write", 00:22:28.101 "zoned": false, 00:22:28.101 "supported_io_types": { 00:22:28.101 "read": true, 00:22:28.101 "write": true, 00:22:28.101 "unmap": true, 00:22:28.101 "flush": true, 00:22:28.101 "reset": true, 00:22:28.101 "nvme_admin": false, 00:22:28.101 "nvme_io": false, 00:22:28.101 "nvme_io_md": false, 00:22:28.101 "write_zeroes": true, 00:22:28.101 "zcopy": true, 00:22:28.101 "get_zone_info": false, 00:22:28.101 "zone_management": false, 00:22:28.101 "zone_append": false, 00:22:28.101 "compare": false, 00:22:28.101 "compare_and_write": false, 00:22:28.101 "abort": true, 00:22:28.101 "seek_hole": false, 00:22:28.101 "seek_data": false, 00:22:28.101 "copy": true, 00:22:28.101 "nvme_iov_md": false 00:22:28.101 }, 00:22:28.101 "memory_domains": [ 00:22:28.101 { 00:22:28.101 "dma_device_id": "system", 00:22:28.101 "dma_device_type": 1 00:22:28.101 }, 00:22:28.101 { 00:22:28.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.101 "dma_device_type": 2 00:22:28.101 } 00:22:28.101 ], 00:22:28.101 "driver_specific": {} 00:22:28.101 }' 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:22:28.101 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:28.361 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:28.620 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:28.620 "name": "BaseBdev4", 00:22:28.620 "aliases": [ 00:22:28.620 "9bb6ac31-944f-4a44-b594-3f87eec60fc1" 00:22:28.620 ], 00:22:28.620 "product_name": "Malloc disk", 00:22:28.620 "block_size": 512, 00:22:28.620 "num_blocks": 65536, 00:22:28.620 "uuid": "9bb6ac31-944f-4a44-b594-3f87eec60fc1", 00:22:28.620 "assigned_rate_limits": { 00:22:28.620 "rw_ios_per_sec": 0, 00:22:28.620 "rw_mbytes_per_sec": 0, 00:22:28.620 "r_mbytes_per_sec": 0, 00:22:28.620 "w_mbytes_per_sec": 0 00:22:28.620 }, 00:22:28.620 "claimed": true, 00:22:28.620 "claim_type": "exclusive_write", 00:22:28.620 "zoned": false, 00:22:28.620 "supported_io_types": { 00:22:28.620 "read": true, 00:22:28.620 "write": true, 00:22:28.620 "unmap": true, 00:22:28.620 "flush": true, 00:22:28.620 "reset": true, 00:22:28.620 "nvme_admin": false, 00:22:28.620 "nvme_io": false, 00:22:28.620 
"nvme_io_md": false, 00:22:28.620 "write_zeroes": true, 00:22:28.620 "zcopy": true, 00:22:28.620 "get_zone_info": false, 00:22:28.620 "zone_management": false, 00:22:28.620 "zone_append": false, 00:22:28.620 "compare": false, 00:22:28.620 "compare_and_write": false, 00:22:28.620 "abort": true, 00:22:28.620 "seek_hole": false, 00:22:28.620 "seek_data": false, 00:22:28.620 "copy": true, 00:22:28.620 "nvme_iov_md": false 00:22:28.620 }, 00:22:28.620 "memory_domains": [ 00:22:28.620 { 00:22:28.620 "dma_device_id": "system", 00:22:28.620 "dma_device_type": 1 00:22:28.620 }, 00:22:28.620 { 00:22:28.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.620 "dma_device_type": 2 00:22:28.620 } 00:22:28.620 ], 00:22:28.620 "driver_specific": {} 00:22:28.620 }' 00:22:28.620 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.620 17:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.620 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:28.620 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:22:28.878 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:29.136 [2024-07-23 17:16:24.508576] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:29.136 [2024-07-23 17:16:24.508603] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:29.136 [2024-07-23 17:16:24.508655] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:29.136 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.137 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:29.137 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.137 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.137 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.137 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:29.395 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.395 "name": "Existed_Raid", 00:22:29.395 "uuid": "e50c9269-0239-4a88-b3aa-44859405768c", 00:22:29.395 "strip_size_kb": 64, 00:22:29.395 "state": "offline", 00:22:29.395 "raid_level": "raid0", 00:22:29.395 "superblock": false, 00:22:29.395 "num_base_bdevs": 4, 00:22:29.395 "num_base_bdevs_discovered": 3, 00:22:29.395 "num_base_bdevs_operational": 3, 00:22:29.395 "base_bdevs_list": [ 00:22:29.395 { 00:22:29.395 "name": null, 00:22:29.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.395 "is_configured": false, 00:22:29.395 "data_offset": 0, 00:22:29.395 "data_size": 65536 00:22:29.395 }, 00:22:29.395 { 00:22:29.395 "name": "BaseBdev2", 00:22:29.395 "uuid": "760e0812-fb99-449c-9b1e-d93ea123c399", 00:22:29.395 "is_configured": true, 00:22:29.395 "data_offset": 0, 00:22:29.395 "data_size": 65536 00:22:29.395 }, 00:22:29.395 { 00:22:29.395 "name": "BaseBdev3", 00:22:29.395 "uuid": "1190096e-8dfe-4c49-a9d4-16cf4946f563", 00:22:29.395 "is_configured": true, 00:22:29.395 "data_offset": 0, 00:22:29.395 "data_size": 65536 00:22:29.396 }, 00:22:29.396 { 00:22:29.396 "name": "BaseBdev4", 00:22:29.396 "uuid": "9bb6ac31-944f-4a44-b594-3f87eec60fc1", 00:22:29.396 "is_configured": true, 00:22:29.396 "data_offset": 0, 00:22:29.396 "data_size": 65536 00:22:29.396 } 00:22:29.396 ] 00:22:29.396 }' 
00:22:29.396 17:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.396 17:16:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:30.331 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:30.590 [2024-07-23 17:16:25.865336] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:30.590 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:30.590 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:30.590 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.590 17:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:30.848 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:30.848 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:22:30.848 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:31.107 [2024-07-23 17:16:26.371016] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:31.107 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:31.107 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:31.107 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.107 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:31.365 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:31.365 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:31.365 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:31.624 [2024-07-23 17:16:26.862824] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:31.624 [2024-07-23 17:16:26.862863] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23eb9b0 name Existed_Raid, state offline 00:22:31.624 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:31.624 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:31.624 17:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.624 17:16:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:31.882 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:31.882 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:31.883 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:31.883 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:31.883 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:31.883 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:32.140 BaseBdev2 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:32.140 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:32.398 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:32.657 [ 00:22:32.657 { 00:22:32.657 "name": 
"BaseBdev2", 00:22:32.657 "aliases": [ 00:22:32.657 "095580b5-8689-4036-b34a-130d01f38561" 00:22:32.657 ], 00:22:32.657 "product_name": "Malloc disk", 00:22:32.657 "block_size": 512, 00:22:32.657 "num_blocks": 65536, 00:22:32.657 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:32.657 "assigned_rate_limits": { 00:22:32.657 "rw_ios_per_sec": 0, 00:22:32.657 "rw_mbytes_per_sec": 0, 00:22:32.657 "r_mbytes_per_sec": 0, 00:22:32.657 "w_mbytes_per_sec": 0 00:22:32.657 }, 00:22:32.657 "claimed": false, 00:22:32.657 "zoned": false, 00:22:32.657 "supported_io_types": { 00:22:32.657 "read": true, 00:22:32.657 "write": true, 00:22:32.657 "unmap": true, 00:22:32.657 "flush": true, 00:22:32.657 "reset": true, 00:22:32.657 "nvme_admin": false, 00:22:32.657 "nvme_io": false, 00:22:32.657 "nvme_io_md": false, 00:22:32.657 "write_zeroes": true, 00:22:32.657 "zcopy": true, 00:22:32.657 "get_zone_info": false, 00:22:32.657 "zone_management": false, 00:22:32.657 "zone_append": false, 00:22:32.657 "compare": false, 00:22:32.657 "compare_and_write": false, 00:22:32.657 "abort": true, 00:22:32.657 "seek_hole": false, 00:22:32.657 "seek_data": false, 00:22:32.657 "copy": true, 00:22:32.657 "nvme_iov_md": false 00:22:32.657 }, 00:22:32.657 "memory_domains": [ 00:22:32.657 { 00:22:32.657 "dma_device_id": "system", 00:22:32.657 "dma_device_type": 1 00:22:32.657 }, 00:22:32.657 { 00:22:32.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.657 "dma_device_type": 2 00:22:32.657 } 00:22:32.657 ], 00:22:32.657 "driver_specific": {} 00:22:32.657 } 00:22:32.657 ] 00:22:32.657 17:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:32.657 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:32.657 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:32.657 17:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:32.915 BaseBdev3 00:22:32.915 17:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:32.916 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:32.916 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:32.916 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:32.916 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:32.916 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:32.916 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:33.174 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:33.174 [ 00:22:33.174 { 00:22:33.174 "name": "BaseBdev3", 00:22:33.174 "aliases": [ 00:22:33.174 "884df540-1a2b-453c-b871-3ec3999e321a" 00:22:33.174 ], 00:22:33.174 "product_name": "Malloc disk", 00:22:33.174 "block_size": 512, 00:22:33.174 "num_blocks": 65536, 00:22:33.174 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:33.174 "assigned_rate_limits": { 00:22:33.174 "rw_ios_per_sec": 0, 00:22:33.174 "rw_mbytes_per_sec": 0, 00:22:33.174 "r_mbytes_per_sec": 0, 00:22:33.174 "w_mbytes_per_sec": 0 00:22:33.174 }, 00:22:33.174 "claimed": false, 00:22:33.174 "zoned": false, 00:22:33.174 "supported_io_types": { 00:22:33.174 "read": true, 00:22:33.174 "write": true, 00:22:33.174 "unmap": true, 00:22:33.174 "flush": true, 00:22:33.174 
"reset": true, 00:22:33.174 "nvme_admin": false, 00:22:33.174 "nvme_io": false, 00:22:33.174 "nvme_io_md": false, 00:22:33.174 "write_zeroes": true, 00:22:33.174 "zcopy": true, 00:22:33.174 "get_zone_info": false, 00:22:33.174 "zone_management": false, 00:22:33.174 "zone_append": false, 00:22:33.174 "compare": false, 00:22:33.174 "compare_and_write": false, 00:22:33.174 "abort": true, 00:22:33.174 "seek_hole": false, 00:22:33.174 "seek_data": false, 00:22:33.174 "copy": true, 00:22:33.174 "nvme_iov_md": false 00:22:33.174 }, 00:22:33.174 "memory_domains": [ 00:22:33.174 { 00:22:33.174 "dma_device_id": "system", 00:22:33.174 "dma_device_type": 1 00:22:33.174 }, 00:22:33.174 { 00:22:33.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.174 "dma_device_type": 2 00:22:33.174 } 00:22:33.174 ], 00:22:33.174 "driver_specific": {} 00:22:33.174 } 00:22:33.174 ] 00:22:33.432 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:33.432 17:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:33.432 17:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:33.432 17:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:33.691 BaseBdev4 00:22:33.691 17:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:33.691 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:33.691 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:33.691 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:33.691 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:33.691 17:16:28 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:33.691 17:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:33.950 17:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:33.950 [ 00:22:33.950 { 00:22:33.950 "name": "BaseBdev4", 00:22:33.950 "aliases": [ 00:22:33.950 "e234e45b-8cbd-44e8-b39b-d0af31251085" 00:22:33.950 ], 00:22:33.950 "product_name": "Malloc disk", 00:22:33.950 "block_size": 512, 00:22:33.950 "num_blocks": 65536, 00:22:33.950 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:33.950 "assigned_rate_limits": { 00:22:33.950 "rw_ios_per_sec": 0, 00:22:33.950 "rw_mbytes_per_sec": 0, 00:22:33.950 "r_mbytes_per_sec": 0, 00:22:33.950 "w_mbytes_per_sec": 0 00:22:33.950 }, 00:22:33.950 "claimed": false, 00:22:33.950 "zoned": false, 00:22:33.950 "supported_io_types": { 00:22:33.950 "read": true, 00:22:33.950 "write": true, 00:22:33.950 "unmap": true, 00:22:33.950 "flush": true, 00:22:33.950 "reset": true, 00:22:33.950 "nvme_admin": false, 00:22:33.950 "nvme_io": false, 00:22:33.950 "nvme_io_md": false, 00:22:33.950 "write_zeroes": true, 00:22:33.950 "zcopy": true, 00:22:33.950 "get_zone_info": false, 00:22:33.950 "zone_management": false, 00:22:33.950 "zone_append": false, 00:22:33.950 "compare": false, 00:22:33.950 "compare_and_write": false, 00:22:33.950 "abort": true, 00:22:33.950 "seek_hole": false, 00:22:33.950 "seek_data": false, 00:22:33.950 "copy": true, 00:22:33.950 "nvme_iov_md": false 00:22:33.950 }, 00:22:33.950 "memory_domains": [ 00:22:33.950 { 00:22:33.950 "dma_device_id": "system", 00:22:33.950 "dma_device_type": 1 00:22:33.950 }, 00:22:33.950 { 00:22:33.950 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:22:33.950 "dma_device_type": 2 00:22:33.950 } 00:22:33.950 ], 00:22:33.950 "driver_specific": {} 00:22:33.950 } 00:22:33.950 ] 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:34.210 [2024-07-23 17:16:29.607274] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:34.210 [2024-07-23 17:16:29.607321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:34.210 [2024-07-23 17:16:29.607343] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:34.210 [2024-07-23 17:16:29.608710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:34.210 [2024-07-23 17:16:29.608752] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:34.210 
17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.210 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.468 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.728 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.728 "name": "Existed_Raid", 00:22:34.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.728 "strip_size_kb": 64, 00:22:34.728 "state": "configuring", 00:22:34.728 "raid_level": "raid0", 00:22:34.728 "superblock": false, 00:22:34.728 "num_base_bdevs": 4, 00:22:34.728 "num_base_bdevs_discovered": 3, 00:22:34.728 "num_base_bdevs_operational": 4, 00:22:34.728 "base_bdevs_list": [ 00:22:34.728 { 00:22:34.728 "name": "BaseBdev1", 00:22:34.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.728 "is_configured": false, 00:22:34.728 "data_offset": 0, 00:22:34.728 "data_size": 0 00:22:34.728 }, 00:22:34.728 { 00:22:34.728 "name": "BaseBdev2", 00:22:34.728 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:34.728 "is_configured": true, 00:22:34.728 "data_offset": 0, 00:22:34.728 "data_size": 65536 00:22:34.728 }, 00:22:34.728 { 00:22:34.728 "name": "BaseBdev3", 00:22:34.728 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:34.728 "is_configured": true, 00:22:34.728 "data_offset": 
0, 00:22:34.728 "data_size": 65536 00:22:34.728 }, 00:22:34.728 { 00:22:34.728 "name": "BaseBdev4", 00:22:34.728 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:34.728 "is_configured": true, 00:22:34.728 "data_offset": 0, 00:22:34.728 "data_size": 65536 00:22:34.728 } 00:22:34.728 ] 00:22:34.728 }' 00:22:34.728 17:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.728 17:16:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:35.296 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:35.554 [2024-07-23 17:16:30.734348] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.554 17:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:35.813 17:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.813 "name": "Existed_Raid", 00:22:35.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.813 "strip_size_kb": 64, 00:22:35.813 "state": "configuring", 00:22:35.813 "raid_level": "raid0", 00:22:35.813 "superblock": false, 00:22:35.813 "num_base_bdevs": 4, 00:22:35.813 "num_base_bdevs_discovered": 2, 00:22:35.813 "num_base_bdevs_operational": 4, 00:22:35.813 "base_bdevs_list": [ 00:22:35.813 { 00:22:35.813 "name": "BaseBdev1", 00:22:35.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.813 "is_configured": false, 00:22:35.813 "data_offset": 0, 00:22:35.813 "data_size": 0 00:22:35.813 }, 00:22:35.813 { 00:22:35.813 "name": null, 00:22:35.813 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:35.813 "is_configured": false, 00:22:35.813 "data_offset": 0, 00:22:35.813 "data_size": 65536 00:22:35.813 }, 00:22:35.813 { 00:22:35.813 "name": "BaseBdev3", 00:22:35.813 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:35.813 "is_configured": true, 00:22:35.813 "data_offset": 0, 00:22:35.813 "data_size": 65536 00:22:35.813 }, 00:22:35.813 { 00:22:35.813 "name": "BaseBdev4", 00:22:35.813 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:35.813 "is_configured": true, 00:22:35.813 "data_offset": 0, 00:22:35.813 "data_size": 65536 00:22:35.813 } 00:22:35.813 ] 00:22:35.813 }' 00:22:35.813 17:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.813 17:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:36.380 17:16:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.380 17:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:36.639 17:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:36.639 17:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:36.898 [2024-07-23 17:16:32.085340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:36.898 BaseBdev1 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:36.898 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:37.194 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:37.194 [ 00:22:37.194 { 00:22:37.194 "name": "BaseBdev1", 00:22:37.194 "aliases": [ 00:22:37.194 
"6c77de50-bd5b-412d-a90f-bb39ff31ba4c" 00:22:37.194 ], 00:22:37.194 "product_name": "Malloc disk", 00:22:37.194 "block_size": 512, 00:22:37.194 "num_blocks": 65536, 00:22:37.194 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:37.194 "assigned_rate_limits": { 00:22:37.194 "rw_ios_per_sec": 0, 00:22:37.194 "rw_mbytes_per_sec": 0, 00:22:37.194 "r_mbytes_per_sec": 0, 00:22:37.194 "w_mbytes_per_sec": 0 00:22:37.194 }, 00:22:37.194 "claimed": true, 00:22:37.194 "claim_type": "exclusive_write", 00:22:37.194 "zoned": false, 00:22:37.194 "supported_io_types": { 00:22:37.194 "read": true, 00:22:37.194 "write": true, 00:22:37.194 "unmap": true, 00:22:37.194 "flush": true, 00:22:37.194 "reset": true, 00:22:37.194 "nvme_admin": false, 00:22:37.194 "nvme_io": false, 00:22:37.194 "nvme_io_md": false, 00:22:37.194 "write_zeroes": true, 00:22:37.194 "zcopy": true, 00:22:37.194 "get_zone_info": false, 00:22:37.194 "zone_management": false, 00:22:37.194 "zone_append": false, 00:22:37.194 "compare": false, 00:22:37.194 "compare_and_write": false, 00:22:37.194 "abort": true, 00:22:37.194 "seek_hole": false, 00:22:37.194 "seek_data": false, 00:22:37.194 "copy": true, 00:22:37.194 "nvme_iov_md": false 00:22:37.194 }, 00:22:37.194 "memory_domains": [ 00:22:37.194 { 00:22:37.194 "dma_device_id": "system", 00:22:37.194 "dma_device_type": 1 00:22:37.194 }, 00:22:37.194 { 00:22:37.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.194 "dma_device_type": 2 00:22:37.194 } 00:22:37.194 ], 00:22:37.194 "driver_specific": {} 00:22:37.194 } 00:22:37.194 ] 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.455 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.455 "name": "Existed_Raid", 00:22:37.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.455 "strip_size_kb": 64, 00:22:37.455 "state": "configuring", 00:22:37.455 "raid_level": "raid0", 00:22:37.455 "superblock": false, 00:22:37.455 "num_base_bdevs": 4, 00:22:37.455 "num_base_bdevs_discovered": 3, 00:22:37.455 "num_base_bdevs_operational": 4, 00:22:37.455 "base_bdevs_list": [ 00:22:37.455 { 00:22:37.455 "name": "BaseBdev1", 00:22:37.455 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:37.456 "is_configured": true, 00:22:37.456 "data_offset": 0, 00:22:37.456 "data_size": 65536 00:22:37.456 }, 00:22:37.456 { 00:22:37.456 "name": null, 00:22:37.456 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 
00:22:37.456 "is_configured": false, 00:22:37.456 "data_offset": 0, 00:22:37.456 "data_size": 65536 00:22:37.456 }, 00:22:37.456 { 00:22:37.456 "name": "BaseBdev3", 00:22:37.456 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:37.456 "is_configured": true, 00:22:37.456 "data_offset": 0, 00:22:37.456 "data_size": 65536 00:22:37.456 }, 00:22:37.456 { 00:22:37.456 "name": "BaseBdev4", 00:22:37.456 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:37.456 "is_configured": true, 00:22:37.456 "data_offset": 0, 00:22:37.456 "data_size": 65536 00:22:37.456 } 00:22:37.456 ] 00:22:37.456 }' 00:22:37.456 17:16:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.456 17:16:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.026 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.026 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:38.284 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:38.284 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:38.542 [2024-07-23 17:16:33.850074] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:38.542 17:16:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.542 17:16:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:38.801 17:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.801 "name": "Existed_Raid", 00:22:38.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.801 "strip_size_kb": 64, 00:22:38.801 "state": "configuring", 00:22:38.801 "raid_level": "raid0", 00:22:38.801 "superblock": false, 00:22:38.801 "num_base_bdevs": 4, 00:22:38.801 "num_base_bdevs_discovered": 2, 00:22:38.801 "num_base_bdevs_operational": 4, 00:22:38.801 "base_bdevs_list": [ 00:22:38.801 { 00:22:38.801 "name": "BaseBdev1", 00:22:38.801 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:38.801 "is_configured": true, 00:22:38.801 "data_offset": 0, 00:22:38.801 "data_size": 65536 00:22:38.801 }, 00:22:38.801 { 00:22:38.801 "name": null, 00:22:38.801 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:38.801 "is_configured": false, 00:22:38.801 "data_offset": 0, 00:22:38.801 
"data_size": 65536 00:22:38.801 }, 00:22:38.801 { 00:22:38.801 "name": null, 00:22:38.801 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:38.801 "is_configured": false, 00:22:38.801 "data_offset": 0, 00:22:38.801 "data_size": 65536 00:22:38.801 }, 00:22:38.801 { 00:22:38.801 "name": "BaseBdev4", 00:22:38.801 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:38.801 "is_configured": true, 00:22:38.801 "data_offset": 0, 00:22:38.801 "data_size": 65536 00:22:38.801 } 00:22:38.801 ] 00:22:38.801 }' 00:22:38.801 17:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.801 17:16:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:39.368 17:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.368 17:16:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:39.626 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:39.626 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:39.885 [2024-07-23 17:16:35.241797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.885 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:40.144 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.144 "name": "Existed_Raid", 00:22:40.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.144 "strip_size_kb": 64, 00:22:40.144 "state": "configuring", 00:22:40.144 "raid_level": "raid0", 00:22:40.144 "superblock": false, 00:22:40.144 "num_base_bdevs": 4, 00:22:40.144 "num_base_bdevs_discovered": 3, 00:22:40.144 "num_base_bdevs_operational": 4, 00:22:40.144 "base_bdevs_list": [ 00:22:40.144 { 00:22:40.144 "name": "BaseBdev1", 00:22:40.144 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:40.144 "is_configured": true, 00:22:40.144 "data_offset": 0, 00:22:40.144 "data_size": 65536 00:22:40.144 }, 00:22:40.144 { 00:22:40.144 "name": null, 00:22:40.144 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:40.144 "is_configured": false, 00:22:40.144 "data_offset": 0, 00:22:40.144 "data_size": 65536 00:22:40.144 }, 00:22:40.144 { 00:22:40.144 "name": 
"BaseBdev3", 00:22:40.144 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:40.144 "is_configured": true, 00:22:40.144 "data_offset": 0, 00:22:40.144 "data_size": 65536 00:22:40.144 }, 00:22:40.144 { 00:22:40.144 "name": "BaseBdev4", 00:22:40.144 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:40.144 "is_configured": true, 00:22:40.144 "data_offset": 0, 00:22:40.144 "data_size": 65536 00:22:40.144 } 00:22:40.144 ] 00:22:40.144 }' 00:22:40.144 17:16:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.144 17:16:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:41.078 17:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.078 17:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:41.337 17:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:41.337 17:16:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:41.905 [2024-07-23 17:16:37.026536] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.905 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:42.473 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.473 "name": "Existed_Raid", 00:22:42.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.473 "strip_size_kb": 64, 00:22:42.473 "state": "configuring", 00:22:42.473 "raid_level": "raid0", 00:22:42.473 "superblock": false, 00:22:42.473 "num_base_bdevs": 4, 00:22:42.473 "num_base_bdevs_discovered": 2, 00:22:42.473 "num_base_bdevs_operational": 4, 00:22:42.473 "base_bdevs_list": [ 00:22:42.473 { 00:22:42.473 "name": null, 00:22:42.473 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:42.473 "is_configured": false, 00:22:42.473 "data_offset": 0, 00:22:42.473 "data_size": 65536 00:22:42.473 }, 00:22:42.473 { 00:22:42.473 "name": null, 00:22:42.473 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:42.473 "is_configured": false, 00:22:42.473 "data_offset": 0, 00:22:42.473 "data_size": 65536 00:22:42.473 }, 00:22:42.473 { 00:22:42.473 "name": "BaseBdev3", 00:22:42.473 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:42.473 "is_configured": true, 
00:22:42.473 "data_offset": 0, 00:22:42.473 "data_size": 65536 00:22:42.473 }, 00:22:42.473 { 00:22:42.473 "name": "BaseBdev4", 00:22:42.473 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:42.473 "is_configured": true, 00:22:42.473 "data_offset": 0, 00:22:42.473 "data_size": 65536 00:22:42.473 } 00:22:42.473 ] 00:22:42.473 }' 00:22:42.473 17:16:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.473 17:16:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:43.415 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:43.415 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.415 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:43.415 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:43.675 [2024-07-23 17:16:38.951988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.675 17:16:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:43.933 17:16:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:43.933 "name": "Existed_Raid", 00:22:43.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.933 "strip_size_kb": 64, 00:22:43.933 "state": "configuring", 00:22:43.933 "raid_level": "raid0", 00:22:43.933 "superblock": false, 00:22:43.933 "num_base_bdevs": 4, 00:22:43.933 "num_base_bdevs_discovered": 3, 00:22:43.933 "num_base_bdevs_operational": 4, 00:22:43.933 "base_bdevs_list": [ 00:22:43.933 { 00:22:43.933 "name": null, 00:22:43.933 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:43.933 "is_configured": false, 00:22:43.933 "data_offset": 0, 00:22:43.933 "data_size": 65536 00:22:43.933 }, 00:22:43.933 { 00:22:43.933 "name": "BaseBdev2", 00:22:43.933 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:43.933 "is_configured": true, 00:22:43.933 "data_offset": 0, 00:22:43.933 "data_size": 65536 00:22:43.933 }, 00:22:43.933 { 00:22:43.933 "name": "BaseBdev3", 00:22:43.933 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:43.933 "is_configured": true, 00:22:43.933 "data_offset": 0, 00:22:43.933 "data_size": 65536 00:22:43.933 }, 
00:22:43.933 { 00:22:43.933 "name": "BaseBdev4", 00:22:43.933 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:43.933 "is_configured": true, 00:22:43.933 "data_offset": 0, 00:22:43.933 "data_size": 65536 00:22:43.933 } 00:22:43.933 ] 00:22:43.933 }' 00:22:43.933 17:16:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:43.933 17:16:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:44.501 17:16:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.501 17:16:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:44.759 17:16:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:44.759 17:16:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.759 17:16:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:45.018 17:16:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6c77de50-bd5b-412d-a90f-bb39ff31ba4c 00:22:45.277 [2024-07-23 17:16:40.556765] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:45.277 [2024-07-23 17:16:40.556808] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23eb390 00:22:45.277 [2024-07-23 17:16:40.556816] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:22:45.277 [2024-07-23 17:16:40.557035] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ee060 00:22:45.277 
[2024-07-23 17:16:40.557156] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23eb390 00:22:45.277 [2024-07-23 17:16:40.557166] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23eb390 00:22:45.277 [2024-07-23 17:16:40.557330] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:45.277 NewBaseBdev 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:45.277 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:45.536 17:16:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:45.795 [ 00:22:45.795 { 00:22:45.795 "name": "NewBaseBdev", 00:22:45.795 "aliases": [ 00:22:45.795 "6c77de50-bd5b-412d-a90f-bb39ff31ba4c" 00:22:45.795 ], 00:22:45.795 "product_name": "Malloc disk", 00:22:45.795 "block_size": 512, 00:22:45.795 "num_blocks": 65536, 00:22:45.795 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:45.795 "assigned_rate_limits": { 00:22:45.795 "rw_ios_per_sec": 0, 00:22:45.795 "rw_mbytes_per_sec": 0, 00:22:45.795 "r_mbytes_per_sec": 0, 00:22:45.795 
"w_mbytes_per_sec": 0 00:22:45.795 }, 00:22:45.795 "claimed": true, 00:22:45.795 "claim_type": "exclusive_write", 00:22:45.795 "zoned": false, 00:22:45.795 "supported_io_types": { 00:22:45.795 "read": true, 00:22:45.795 "write": true, 00:22:45.795 "unmap": true, 00:22:45.795 "flush": true, 00:22:45.795 "reset": true, 00:22:45.795 "nvme_admin": false, 00:22:45.795 "nvme_io": false, 00:22:45.795 "nvme_io_md": false, 00:22:45.795 "write_zeroes": true, 00:22:45.795 "zcopy": true, 00:22:45.795 "get_zone_info": false, 00:22:45.795 "zone_management": false, 00:22:45.795 "zone_append": false, 00:22:45.795 "compare": false, 00:22:45.795 "compare_and_write": false, 00:22:45.795 "abort": true, 00:22:45.795 "seek_hole": false, 00:22:45.795 "seek_data": false, 00:22:45.795 "copy": true, 00:22:45.795 "nvme_iov_md": false 00:22:45.795 }, 00:22:45.795 "memory_domains": [ 00:22:45.795 { 00:22:45.795 "dma_device_id": "system", 00:22:45.795 "dma_device_type": 1 00:22:45.795 }, 00:22:45.795 { 00:22:45.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.795 "dma_device_type": 2 00:22:45.795 } 00:22:45.795 ], 00:22:45.795 "driver_specific": {} 00:22:45.795 } 00:22:45.795 ] 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.795 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:46.053 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.053 "name": "Existed_Raid", 00:22:46.053 "uuid": "0ab1e8af-c1fc-42df-84da-07cdeaf009c2", 00:22:46.053 "strip_size_kb": 64, 00:22:46.053 "state": "online", 00:22:46.053 "raid_level": "raid0", 00:22:46.053 "superblock": false, 00:22:46.053 "num_base_bdevs": 4, 00:22:46.053 "num_base_bdevs_discovered": 4, 00:22:46.053 "num_base_bdevs_operational": 4, 00:22:46.053 "base_bdevs_list": [ 00:22:46.053 { 00:22:46.053 "name": "NewBaseBdev", 00:22:46.053 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:46.053 "is_configured": true, 00:22:46.054 "data_offset": 0, 00:22:46.054 "data_size": 65536 00:22:46.054 }, 00:22:46.054 { 00:22:46.054 "name": "BaseBdev2", 00:22:46.054 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:46.054 "is_configured": true, 00:22:46.054 "data_offset": 0, 00:22:46.054 "data_size": 65536 00:22:46.054 }, 00:22:46.054 { 00:22:46.054 "name": "BaseBdev3", 00:22:46.054 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:46.054 "is_configured": true, 00:22:46.054 "data_offset": 0, 00:22:46.054 "data_size": 65536 00:22:46.054 }, 00:22:46.054 { 00:22:46.054 "name": "BaseBdev4", 
00:22:46.054 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:46.054 "is_configured": true, 00:22:46.054 "data_offset": 0, 00:22:46.054 "data_size": 65536 00:22:46.054 } 00:22:46.054 ] 00:22:46.054 }' 00:22:46.054 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.054 17:16:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:46.619 17:16:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:46.878 [2024-07-23 17:16:42.221503] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:46.878 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:46.878 "name": "Existed_Raid", 00:22:46.878 "aliases": [ 00:22:46.878 "0ab1e8af-c1fc-42df-84da-07cdeaf009c2" 00:22:46.878 ], 00:22:46.878 "product_name": "Raid Volume", 00:22:46.878 "block_size": 512, 00:22:46.878 "num_blocks": 262144, 00:22:46.878 "uuid": "0ab1e8af-c1fc-42df-84da-07cdeaf009c2", 00:22:46.878 "assigned_rate_limits": { 00:22:46.878 "rw_ios_per_sec": 0, 00:22:46.878 
"rw_mbytes_per_sec": 0, 00:22:46.878 "r_mbytes_per_sec": 0, 00:22:46.878 "w_mbytes_per_sec": 0 00:22:46.878 }, 00:22:46.878 "claimed": false, 00:22:46.878 "zoned": false, 00:22:46.878 "supported_io_types": { 00:22:46.878 "read": true, 00:22:46.878 "write": true, 00:22:46.878 "unmap": true, 00:22:46.878 "flush": true, 00:22:46.878 "reset": true, 00:22:46.878 "nvme_admin": false, 00:22:46.878 "nvme_io": false, 00:22:46.878 "nvme_io_md": false, 00:22:46.878 "write_zeroes": true, 00:22:46.878 "zcopy": false, 00:22:46.878 "get_zone_info": false, 00:22:46.878 "zone_management": false, 00:22:46.878 "zone_append": false, 00:22:46.878 "compare": false, 00:22:46.878 "compare_and_write": false, 00:22:46.878 "abort": false, 00:22:46.878 "seek_hole": false, 00:22:46.878 "seek_data": false, 00:22:46.878 "copy": false, 00:22:46.878 "nvme_iov_md": false 00:22:46.878 }, 00:22:46.878 "memory_domains": [ 00:22:46.878 { 00:22:46.878 "dma_device_id": "system", 00:22:46.878 "dma_device_type": 1 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.878 "dma_device_type": 2 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "system", 00:22:46.878 "dma_device_type": 1 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.878 "dma_device_type": 2 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "system", 00:22:46.878 "dma_device_type": 1 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.878 "dma_device_type": 2 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "system", 00:22:46.878 "dma_device_type": 1 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.878 "dma_device_type": 2 00:22:46.878 } 00:22:46.878 ], 00:22:46.878 "driver_specific": { 00:22:46.878 "raid": { 00:22:46.878 "uuid": "0ab1e8af-c1fc-42df-84da-07cdeaf009c2", 00:22:46.878 "strip_size_kb": 64, 00:22:46.878 "state": "online", 
00:22:46.878 "raid_level": "raid0", 00:22:46.878 "superblock": false, 00:22:46.878 "num_base_bdevs": 4, 00:22:46.878 "num_base_bdevs_discovered": 4, 00:22:46.878 "num_base_bdevs_operational": 4, 00:22:46.878 "base_bdevs_list": [ 00:22:46.878 { 00:22:46.878 "name": "NewBaseBdev", 00:22:46.878 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:46.878 "is_configured": true, 00:22:46.878 "data_offset": 0, 00:22:46.878 "data_size": 65536 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "name": "BaseBdev2", 00:22:46.878 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:46.878 "is_configured": true, 00:22:46.878 "data_offset": 0, 00:22:46.878 "data_size": 65536 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "name": "BaseBdev3", 00:22:46.878 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:46.878 "is_configured": true, 00:22:46.878 "data_offset": 0, 00:22:46.878 "data_size": 65536 00:22:46.878 }, 00:22:46.878 { 00:22:46.878 "name": "BaseBdev4", 00:22:46.878 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:46.878 "is_configured": true, 00:22:46.878 "data_offset": 0, 00:22:46.878 "data_size": 65536 00:22:46.878 } 00:22:46.878 ] 00:22:46.878 } 00:22:46.878 } 00:22:46.878 }' 00:22:46.878 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:46.878 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:46.878 BaseBdev2 00:22:46.878 BaseBdev3 00:22:46.878 BaseBdev4' 00:22:46.878 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:46.878 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:46.878 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:47.445 17:16:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:47.445 "name": "NewBaseBdev", 00:22:47.445 "aliases": [ 00:22:47.445 "6c77de50-bd5b-412d-a90f-bb39ff31ba4c" 00:22:47.445 ], 00:22:47.445 "product_name": "Malloc disk", 00:22:47.445 "block_size": 512, 00:22:47.445 "num_blocks": 65536, 00:22:47.445 "uuid": "6c77de50-bd5b-412d-a90f-bb39ff31ba4c", 00:22:47.445 "assigned_rate_limits": { 00:22:47.445 "rw_ios_per_sec": 0, 00:22:47.445 "rw_mbytes_per_sec": 0, 00:22:47.445 "r_mbytes_per_sec": 0, 00:22:47.445 "w_mbytes_per_sec": 0 00:22:47.445 }, 00:22:47.445 "claimed": true, 00:22:47.445 "claim_type": "exclusive_write", 00:22:47.445 "zoned": false, 00:22:47.445 "supported_io_types": { 00:22:47.445 "read": true, 00:22:47.445 "write": true, 00:22:47.445 "unmap": true, 00:22:47.445 "flush": true, 00:22:47.445 "reset": true, 00:22:47.445 "nvme_admin": false, 00:22:47.445 "nvme_io": false, 00:22:47.445 "nvme_io_md": false, 00:22:47.445 "write_zeroes": true, 00:22:47.445 "zcopy": true, 00:22:47.445 "get_zone_info": false, 00:22:47.445 "zone_management": false, 00:22:47.445 "zone_append": false, 00:22:47.445 "compare": false, 00:22:47.445 "compare_and_write": false, 00:22:47.445 "abort": true, 00:22:47.445 "seek_hole": false, 00:22:47.445 "seek_data": false, 00:22:47.445 "copy": true, 00:22:47.445 "nvme_iov_md": false 00:22:47.445 }, 00:22:47.445 "memory_domains": [ 00:22:47.445 { 00:22:47.445 "dma_device_id": "system", 00:22:47.445 "dma_device_type": 1 00:22:47.445 }, 00:22:47.445 { 00:22:47.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.445 "dma_device_type": 2 00:22:47.445 } 00:22:47.445 ], 00:22:47.445 "driver_specific": {} 00:22:47.445 }' 00:22:47.445 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:47.703 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:47.703 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:22:47.703 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:47.703 17:16:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:47.703 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:47.703 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:47.703 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:47.703 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:47.703 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:47.961 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:47.961 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:47.961 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:47.961 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:47.961 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:48.219 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:48.219 "name": "BaseBdev2", 00:22:48.219 "aliases": [ 00:22:48.219 "095580b5-8689-4036-b34a-130d01f38561" 00:22:48.219 ], 00:22:48.219 "product_name": "Malloc disk", 00:22:48.219 "block_size": 512, 00:22:48.219 "num_blocks": 65536, 00:22:48.219 "uuid": "095580b5-8689-4036-b34a-130d01f38561", 00:22:48.219 "assigned_rate_limits": { 00:22:48.219 "rw_ios_per_sec": 0, 00:22:48.219 "rw_mbytes_per_sec": 0, 00:22:48.219 "r_mbytes_per_sec": 0, 00:22:48.219 "w_mbytes_per_sec": 0 00:22:48.219 }, 00:22:48.219 "claimed": true, 00:22:48.219 
"claim_type": "exclusive_write", 00:22:48.219 "zoned": false, 00:22:48.219 "supported_io_types": { 00:22:48.219 "read": true, 00:22:48.219 "write": true, 00:22:48.219 "unmap": true, 00:22:48.219 "flush": true, 00:22:48.219 "reset": true, 00:22:48.219 "nvme_admin": false, 00:22:48.219 "nvme_io": false, 00:22:48.219 "nvme_io_md": false, 00:22:48.219 "write_zeroes": true, 00:22:48.219 "zcopy": true, 00:22:48.219 "get_zone_info": false, 00:22:48.219 "zone_management": false, 00:22:48.219 "zone_append": false, 00:22:48.219 "compare": false, 00:22:48.219 "compare_and_write": false, 00:22:48.219 "abort": true, 00:22:48.219 "seek_hole": false, 00:22:48.219 "seek_data": false, 00:22:48.219 "copy": true, 00:22:48.219 "nvme_iov_md": false 00:22:48.219 }, 00:22:48.219 "memory_domains": [ 00:22:48.219 { 00:22:48.219 "dma_device_id": "system", 00:22:48.219 "dma_device_type": 1 00:22:48.219 }, 00:22:48.219 { 00:22:48.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.219 "dma_device_type": 2 00:22:48.219 } 00:22:48.219 ], 00:22:48.219 "driver_specific": {} 00:22:48.219 }' 00:22:48.219 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.219 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.219 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:48.219 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:48.219 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:48.477 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:48.477 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:48.477 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:48.477 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:22:48.477 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:48.477 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:48.736 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:48.736 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:48.736 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:48.736 17:16:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:48.993 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:48.993 "name": "BaseBdev3", 00:22:48.993 "aliases": [ 00:22:48.993 "884df540-1a2b-453c-b871-3ec3999e321a" 00:22:48.993 ], 00:22:48.993 "product_name": "Malloc disk", 00:22:48.993 "block_size": 512, 00:22:48.993 "num_blocks": 65536, 00:22:48.993 "uuid": "884df540-1a2b-453c-b871-3ec3999e321a", 00:22:48.993 "assigned_rate_limits": { 00:22:48.993 "rw_ios_per_sec": 0, 00:22:48.993 "rw_mbytes_per_sec": 0, 00:22:48.993 "r_mbytes_per_sec": 0, 00:22:48.993 "w_mbytes_per_sec": 0 00:22:48.993 }, 00:22:48.993 "claimed": true, 00:22:48.993 "claim_type": "exclusive_write", 00:22:48.993 "zoned": false, 00:22:48.993 "supported_io_types": { 00:22:48.993 "read": true, 00:22:48.993 "write": true, 00:22:48.993 "unmap": true, 00:22:48.993 "flush": true, 00:22:48.993 "reset": true, 00:22:48.993 "nvme_admin": false, 00:22:48.993 "nvme_io": false, 00:22:48.993 "nvme_io_md": false, 00:22:48.993 "write_zeroes": true, 00:22:48.993 "zcopy": true, 00:22:48.993 "get_zone_info": false, 00:22:48.993 "zone_management": false, 00:22:48.993 "zone_append": false, 00:22:48.993 "compare": false, 00:22:48.993 "compare_and_write": false, 00:22:48.993 "abort": true, 00:22:48.993 
"seek_hole": false, 00:22:48.993 "seek_data": false, 00:22:48.993 "copy": true, 00:22:48.993 "nvme_iov_md": false 00:22:48.993 }, 00:22:48.994 "memory_domains": [ 00:22:48.994 { 00:22:48.994 "dma_device_id": "system", 00:22:48.994 "dma_device_type": 1 00:22:48.994 }, 00:22:48.994 { 00:22:48.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.994 "dma_device_type": 2 00:22:48.994 } 00:22:48.994 ], 00:22:48.994 "driver_specific": {} 00:22:48.994 }' 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:48.994 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:49.252 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:49.511 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:49.511 "name": "BaseBdev4", 00:22:49.511 "aliases": [ 00:22:49.511 "e234e45b-8cbd-44e8-b39b-d0af31251085" 00:22:49.511 ], 00:22:49.511 "product_name": "Malloc disk", 00:22:49.511 "block_size": 512, 00:22:49.511 "num_blocks": 65536, 00:22:49.511 "uuid": "e234e45b-8cbd-44e8-b39b-d0af31251085", 00:22:49.511 "assigned_rate_limits": { 00:22:49.511 "rw_ios_per_sec": 0, 00:22:49.511 "rw_mbytes_per_sec": 0, 00:22:49.511 "r_mbytes_per_sec": 0, 00:22:49.511 "w_mbytes_per_sec": 0 00:22:49.511 }, 00:22:49.511 "claimed": true, 00:22:49.511 "claim_type": "exclusive_write", 00:22:49.511 "zoned": false, 00:22:49.511 "supported_io_types": { 00:22:49.511 "read": true, 00:22:49.511 "write": true, 00:22:49.511 "unmap": true, 00:22:49.511 "flush": true, 00:22:49.511 "reset": true, 00:22:49.511 "nvme_admin": false, 00:22:49.511 "nvme_io": false, 00:22:49.511 "nvme_io_md": false, 00:22:49.511 "write_zeroes": true, 00:22:49.511 "zcopy": true, 00:22:49.511 "get_zone_info": false, 00:22:49.511 "zone_management": false, 00:22:49.511 "zone_append": false, 00:22:49.511 "compare": false, 00:22:49.511 "compare_and_write": false, 00:22:49.511 "abort": true, 00:22:49.511 "seek_hole": false, 00:22:49.511 "seek_data": false, 00:22:49.511 "copy": true, 00:22:49.511 "nvme_iov_md": false 00:22:49.511 }, 00:22:49.511 "memory_domains": [ 00:22:49.511 { 00:22:49.511 "dma_device_id": "system", 00:22:49.511 "dma_device_type": 1 00:22:49.511 }, 00:22:49.511 { 00:22:49.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:49.511 "dma_device_type": 2 00:22:49.511 } 00:22:49.511 ], 00:22:49.511 "driver_specific": {} 00:22:49.511 }' 00:22:49.511 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.511 17:16:44 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.769 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.769 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.769 17:16:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.769 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.769 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.769 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.769 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.769 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.769 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.027 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:50.027 17:16:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:50.285 [2024-07-23 17:16:45.678380] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:50.285 [2024-07-23 17:16:45.678409] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:50.285 [2024-07-23 17:16:45.678466] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.285 [2024-07-23 17:16:45.678526] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.285 [2024-07-23 17:16:45.678538] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23eb390 name Existed_Raid, state offline 00:22:50.543 17:16:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4171648 00:22:50.543 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4171648 ']' 00:22:50.543 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4171648 00:22:50.543 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:22:50.543 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:50.543 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4171648 00:22:50.543 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:50.544 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:50.544 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4171648' 00:22:50.544 killing process with pid 4171648 00:22:50.544 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4171648 00:22:50.544 [2024-07-23 17:16:45.759227] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:50.544 17:16:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4171648 00:22:50.544 [2024-07-23 17:16:45.802058] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:50.802 00:22:50.802 real 0m35.236s 00:22:50.802 user 1m4.943s 00:22:50.802 sys 0m6.087s 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.802 ************************************ 00:22:50.802 END TEST raid_state_function_test 
00:22:50.802 ************************************ 00:22:50.802 17:16:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:50.802 17:16:46 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:22:50.802 17:16:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:50.802 17:16:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:50.802 17:16:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:50.802 ************************************ 00:22:50.802 START TEST raid_state_function_test_sb 00:22:50.802 ************************************ 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4176876 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4176876' 00:22:50.802 Process raid pid: 4176876 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4176876 /var/tmp/spdk-raid.sock 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4176876 ']' 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:50.802 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:50.803 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:50.803 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:50.803 17:16:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:50.803 [2024-07-23 17:16:46.180598] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:22:50.803 [2024-07-23 17:16:46.180665] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:51.061 [2024-07-23 17:16:46.313439] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:51.061 [2024-07-23 17:16:46.369258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:51.061 [2024-07-23 17:16:46.435714] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.061 [2024-07-23 17:16:46.435749] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.997 17:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:51.997 17:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:51.997 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:52.563 [2024-07-23 17:16:47.865834] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:52.563 [2024-07-23 17:16:47.865875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:52.563 [2024-07-23 17:16:47.865885] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:52.563 [2024-07-23 17:16:47.865903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:52.563 [2024-07-23 17:16:47.865912] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:52.563 [2024-07-23 17:16:47.865923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:22:52.563 [2024-07-23 17:16:47.865932] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:52.563 [2024-07-23 17:16:47.865943] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.563 17:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:52.822 17:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.822 "name": "Existed_Raid", 00:22:52.822 "uuid": "336455a8-face-4fe2-b50a-eff50ce65be0", 
00:22:52.822 "strip_size_kb": 64, 00:22:52.822 "state": "configuring", 00:22:52.822 "raid_level": "raid0", 00:22:52.822 "superblock": true, 00:22:52.822 "num_base_bdevs": 4, 00:22:52.822 "num_base_bdevs_discovered": 0, 00:22:52.822 "num_base_bdevs_operational": 4, 00:22:52.822 "base_bdevs_list": [ 00:22:52.822 { 00:22:52.822 "name": "BaseBdev1", 00:22:52.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.822 "is_configured": false, 00:22:52.822 "data_offset": 0, 00:22:52.822 "data_size": 0 00:22:52.822 }, 00:22:52.822 { 00:22:52.822 "name": "BaseBdev2", 00:22:52.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.822 "is_configured": false, 00:22:52.822 "data_offset": 0, 00:22:52.822 "data_size": 0 00:22:52.822 }, 00:22:52.822 { 00:22:52.822 "name": "BaseBdev3", 00:22:52.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.822 "is_configured": false, 00:22:52.822 "data_offset": 0, 00:22:52.822 "data_size": 0 00:22:52.822 }, 00:22:52.822 { 00:22:52.822 "name": "BaseBdev4", 00:22:52.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.822 "is_configured": false, 00:22:52.822 "data_offset": 0, 00:22:52.822 "data_size": 0 00:22:52.822 } 00:22:52.822 ] 00:22:52.822 }' 00:22:52.822 17:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.822 17:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:53.391 17:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:53.958 [2024-07-23 17:16:49.209197] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:53.959 [2024-07-23 17:16:49.209227] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7ac430 name Existed_Raid, state configuring 00:22:53.959 17:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:54.553 [2024-07-23 17:16:49.722578] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:54.553 [2024-07-23 17:16:49.722611] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:54.553 [2024-07-23 17:16:49.722622] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:54.553 [2024-07-23 17:16:49.722633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:54.553 [2024-07-23 17:16:49.722641] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:54.553 [2024-07-23 17:16:49.722652] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:54.553 [2024-07-23 17:16:49.722665] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:54.553 [2024-07-23 17:16:49.722677] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:54.553 17:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:55.129 [2024-07-23 17:16:50.266775] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:55.129 BaseBdev1 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:55.129 17:16:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:55.129 17:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:55.697 [ 00:22:55.697 { 00:22:55.697 "name": "BaseBdev1", 00:22:55.697 "aliases": [ 00:22:55.697 "01755f35-7535-4eaf-b568-9ffbad354041" 00:22:55.697 ], 00:22:55.697 "product_name": "Malloc disk", 00:22:55.697 "block_size": 512, 00:22:55.697 "num_blocks": 65536, 00:22:55.697 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:22:55.697 "assigned_rate_limits": { 00:22:55.697 "rw_ios_per_sec": 0, 00:22:55.697 "rw_mbytes_per_sec": 0, 00:22:55.697 "r_mbytes_per_sec": 0, 00:22:55.697 "w_mbytes_per_sec": 0 00:22:55.697 }, 00:22:55.697 "claimed": true, 00:22:55.697 "claim_type": "exclusive_write", 00:22:55.697 "zoned": false, 00:22:55.697 "supported_io_types": { 00:22:55.697 "read": true, 00:22:55.697 "write": true, 00:22:55.697 "unmap": true, 00:22:55.697 "flush": true, 00:22:55.697 "reset": true, 00:22:55.697 "nvme_admin": false, 00:22:55.697 "nvme_io": false, 00:22:55.697 "nvme_io_md": false, 00:22:55.697 "write_zeroes": true, 00:22:55.697 "zcopy": true, 00:22:55.697 "get_zone_info": false, 00:22:55.697 "zone_management": false, 00:22:55.697 "zone_append": false, 00:22:55.697 "compare": false, 00:22:55.697 "compare_and_write": false, 00:22:55.697 "abort": true, 00:22:55.697 "seek_hole": false, 00:22:55.697 "seek_data": false, 
00:22:55.697 "copy": true, 00:22:55.697 "nvme_iov_md": false 00:22:55.697 }, 00:22:55.697 "memory_domains": [ 00:22:55.697 { 00:22:55.697 "dma_device_id": "system", 00:22:55.697 "dma_device_type": 1 00:22:55.697 }, 00:22:55.697 { 00:22:55.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.697 "dma_device_type": 2 00:22:55.697 } 00:22:55.697 ], 00:22:55.697 "driver_specific": {} 00:22:55.697 } 00:22:55.697 ] 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.697 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:56.265 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.265 "name": "Existed_Raid", 00:22:56.265 "uuid": "bc90a3c2-9c00-4fe9-86ed-f2626f82957f", 00:22:56.265 "strip_size_kb": 64, 00:22:56.265 "state": "configuring", 00:22:56.265 "raid_level": "raid0", 00:22:56.265 "superblock": true, 00:22:56.265 "num_base_bdevs": 4, 00:22:56.265 "num_base_bdevs_discovered": 1, 00:22:56.265 "num_base_bdevs_operational": 4, 00:22:56.265 "base_bdevs_list": [ 00:22:56.265 { 00:22:56.265 "name": "BaseBdev1", 00:22:56.265 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:22:56.265 "is_configured": true, 00:22:56.265 "data_offset": 2048, 00:22:56.265 "data_size": 63488 00:22:56.265 }, 00:22:56.265 { 00:22:56.265 "name": "BaseBdev2", 00:22:56.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.265 "is_configured": false, 00:22:56.265 "data_offset": 0, 00:22:56.265 "data_size": 0 00:22:56.265 }, 00:22:56.265 { 00:22:56.265 "name": "BaseBdev3", 00:22:56.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.265 "is_configured": false, 00:22:56.265 "data_offset": 0, 00:22:56.265 "data_size": 0 00:22:56.265 }, 00:22:56.265 { 00:22:56.265 "name": "BaseBdev4", 00:22:56.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.265 "is_configured": false, 00:22:56.265 "data_offset": 0, 00:22:56.265 "data_size": 0 00:22:56.265 } 00:22:56.265 ] 00:22:56.265 }' 00:22:56.265 17:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.265 17:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:57.200 17:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:57.767 [2024-07-23 17:16:52.925828] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:22:57.767 [2024-07-23 17:16:52.925870] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7abd60 name Existed_Raid, state configuring 00:22:57.767 17:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:58.025 [2024-07-23 17:16:53.439228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:58.025 [2024-07-23 17:16:53.440692] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:58.025 [2024-07-23 17:16:53.440724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:58.025 [2024-07-23 17:16:53.440734] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:58.025 [2024-07-23 17:16:53.440746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:58.025 [2024-07-23 17:16:53.440755] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:58.025 [2024-07-23 17:16:53.440766] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:58.283 17:16:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.283 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:58.850 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.850 "name": "Existed_Raid", 00:22:58.850 "uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:22:58.850 "strip_size_kb": 64, 00:22:58.850 "state": "configuring", 00:22:58.850 "raid_level": "raid0", 00:22:58.850 "superblock": true, 00:22:58.850 "num_base_bdevs": 4, 00:22:58.850 "num_base_bdevs_discovered": 1, 00:22:58.850 "num_base_bdevs_operational": 4, 00:22:58.850 "base_bdevs_list": [ 00:22:58.850 { 00:22:58.850 "name": "BaseBdev1", 00:22:58.850 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:22:58.850 "is_configured": true, 00:22:58.850 "data_offset": 2048, 00:22:58.850 "data_size": 63488 00:22:58.850 }, 00:22:58.850 { 00:22:58.850 "name": "BaseBdev2", 00:22:58.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.850 "is_configured": false, 
00:22:58.850 "data_offset": 0, 00:22:58.850 "data_size": 0 00:22:58.850 }, 00:22:58.850 { 00:22:58.850 "name": "BaseBdev3", 00:22:58.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.850 "is_configured": false, 00:22:58.850 "data_offset": 0, 00:22:58.850 "data_size": 0 00:22:58.850 }, 00:22:58.850 { 00:22:58.850 "name": "BaseBdev4", 00:22:58.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.850 "is_configured": false, 00:22:58.850 "data_offset": 0, 00:22:58.850 "data_size": 0 00:22:58.850 } 00:22:58.850 ] 00:22:58.850 }' 00:22:58.850 17:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.850 17:16:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:59.416 17:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:59.675 [2024-07-23 17:16:55.076051] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:59.675 BaseBdev2 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:59.933 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:23:00.501 17:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:00.759 [ 00:23:00.759 { 00:23:00.759 "name": "BaseBdev2", 00:23:00.759 "aliases": [ 00:23:00.759 "9ee8f984-eb35-4583-8430-83a6173f225b" 00:23:00.759 ], 00:23:00.759 "product_name": "Malloc disk", 00:23:00.759 "block_size": 512, 00:23:00.759 "num_blocks": 65536, 00:23:00.759 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:00.759 "assigned_rate_limits": { 00:23:00.759 "rw_ios_per_sec": 0, 00:23:00.759 "rw_mbytes_per_sec": 0, 00:23:00.759 "r_mbytes_per_sec": 0, 00:23:00.759 "w_mbytes_per_sec": 0 00:23:00.759 }, 00:23:00.759 "claimed": true, 00:23:00.759 "claim_type": "exclusive_write", 00:23:00.759 "zoned": false, 00:23:00.759 "supported_io_types": { 00:23:00.759 "read": true, 00:23:00.759 "write": true, 00:23:00.759 "unmap": true, 00:23:00.759 "flush": true, 00:23:00.759 "reset": true, 00:23:00.759 "nvme_admin": false, 00:23:00.759 "nvme_io": false, 00:23:00.759 "nvme_io_md": false, 00:23:00.759 "write_zeroes": true, 00:23:00.759 "zcopy": true, 00:23:00.759 "get_zone_info": false, 00:23:00.759 "zone_management": false, 00:23:00.759 "zone_append": false, 00:23:00.759 "compare": false, 00:23:00.759 "compare_and_write": false, 00:23:00.759 "abort": true, 00:23:00.759 "seek_hole": false, 00:23:00.759 "seek_data": false, 00:23:00.759 "copy": true, 00:23:00.759 "nvme_iov_md": false 00:23:00.759 }, 00:23:00.759 "memory_domains": [ 00:23:00.759 { 00:23:00.759 "dma_device_id": "system", 00:23:00.759 "dma_device_type": 1 00:23:00.759 }, 00:23:00.759 { 00:23:00.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.759 "dma_device_type": 2 00:23:00.759 } 00:23:00.759 ], 00:23:00.759 "driver_specific": {} 00:23:00.759 } 00:23:00.759 ] 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.759 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:01.017 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.017 "name": "Existed_Raid", 00:23:01.017 "uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:23:01.017 "strip_size_kb": 64, 
00:23:01.017 "state": "configuring", 00:23:01.017 "raid_level": "raid0", 00:23:01.017 "superblock": true, 00:23:01.017 "num_base_bdevs": 4, 00:23:01.017 "num_base_bdevs_discovered": 2, 00:23:01.017 "num_base_bdevs_operational": 4, 00:23:01.017 "base_bdevs_list": [ 00:23:01.017 { 00:23:01.017 "name": "BaseBdev1", 00:23:01.017 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:23:01.017 "is_configured": true, 00:23:01.017 "data_offset": 2048, 00:23:01.017 "data_size": 63488 00:23:01.017 }, 00:23:01.017 { 00:23:01.017 "name": "BaseBdev2", 00:23:01.017 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:01.017 "is_configured": true, 00:23:01.017 "data_offset": 2048, 00:23:01.017 "data_size": 63488 00:23:01.017 }, 00:23:01.017 { 00:23:01.017 "name": "BaseBdev3", 00:23:01.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.017 "is_configured": false, 00:23:01.017 "data_offset": 0, 00:23:01.017 "data_size": 0 00:23:01.017 }, 00:23:01.017 { 00:23:01.017 "name": "BaseBdev4", 00:23:01.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.018 "is_configured": false, 00:23:01.018 "data_offset": 0, 00:23:01.018 "data_size": 0 00:23:01.018 } 00:23:01.018 ] 00:23:01.018 }' 00:23:01.018 17:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.018 17:16:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:01.952 [2024-07-23 17:16:57.257271] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:01.952 BaseBdev3 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:01.952 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:02.210 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:02.468 [ 00:23:02.468 { 00:23:02.468 "name": "BaseBdev3", 00:23:02.468 "aliases": [ 00:23:02.468 "d0a50fd3-2110-41b6-9f86-3785140ea1b6" 00:23:02.468 ], 00:23:02.468 "product_name": "Malloc disk", 00:23:02.468 "block_size": 512, 00:23:02.468 "num_blocks": 65536, 00:23:02.468 "uuid": "d0a50fd3-2110-41b6-9f86-3785140ea1b6", 00:23:02.468 "assigned_rate_limits": { 00:23:02.468 "rw_ios_per_sec": 0, 00:23:02.468 "rw_mbytes_per_sec": 0, 00:23:02.468 "r_mbytes_per_sec": 0, 00:23:02.468 "w_mbytes_per_sec": 0 00:23:02.468 }, 00:23:02.468 "claimed": true, 00:23:02.468 "claim_type": "exclusive_write", 00:23:02.468 "zoned": false, 00:23:02.468 "supported_io_types": { 00:23:02.468 "read": true, 00:23:02.468 "write": true, 00:23:02.469 "unmap": true, 00:23:02.469 "flush": true, 00:23:02.469 "reset": true, 00:23:02.469 "nvme_admin": false, 00:23:02.469 "nvme_io": false, 00:23:02.469 "nvme_io_md": false, 00:23:02.469 "write_zeroes": true, 00:23:02.469 "zcopy": true, 00:23:02.469 "get_zone_info": false, 00:23:02.469 "zone_management": false, 00:23:02.469 "zone_append": false, 00:23:02.469 
"compare": false, 00:23:02.469 "compare_and_write": false, 00:23:02.469 "abort": true, 00:23:02.469 "seek_hole": false, 00:23:02.469 "seek_data": false, 00:23:02.469 "copy": true, 00:23:02.469 "nvme_iov_md": false 00:23:02.469 }, 00:23:02.469 "memory_domains": [ 00:23:02.469 { 00:23:02.469 "dma_device_id": "system", 00:23:02.469 "dma_device_type": 1 00:23:02.469 }, 00:23:02.469 { 00:23:02.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.469 "dma_device_type": 2 00:23:02.469 } 00:23:02.469 ], 00:23:02.469 "driver_specific": {} 00:23:02.469 } 00:23:02.469 ] 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.469 17:16:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.469 17:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:02.727 17:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.727 "name": "Existed_Raid", 00:23:02.727 "uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:23:02.727 "strip_size_kb": 64, 00:23:02.727 "state": "configuring", 00:23:02.727 "raid_level": "raid0", 00:23:02.727 "superblock": true, 00:23:02.727 "num_base_bdevs": 4, 00:23:02.727 "num_base_bdevs_discovered": 3, 00:23:02.727 "num_base_bdevs_operational": 4, 00:23:02.727 "base_bdevs_list": [ 00:23:02.727 { 00:23:02.727 "name": "BaseBdev1", 00:23:02.727 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:23:02.727 "is_configured": true, 00:23:02.727 "data_offset": 2048, 00:23:02.727 "data_size": 63488 00:23:02.727 }, 00:23:02.727 { 00:23:02.727 "name": "BaseBdev2", 00:23:02.727 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:02.727 "is_configured": true, 00:23:02.727 "data_offset": 2048, 00:23:02.727 "data_size": 63488 00:23:02.727 }, 00:23:02.727 { 00:23:02.727 "name": "BaseBdev3", 00:23:02.727 "uuid": "d0a50fd3-2110-41b6-9f86-3785140ea1b6", 00:23:02.727 "is_configured": true, 00:23:02.727 "data_offset": 2048, 00:23:02.727 "data_size": 63488 00:23:02.727 }, 00:23:02.727 { 00:23:02.727 "name": "BaseBdev4", 00:23:02.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.727 "is_configured": false, 00:23:02.727 "data_offset": 0, 00:23:02.727 "data_size": 0 00:23:02.727 } 00:23:02.727 ] 00:23:02.727 }' 00:23:02.727 17:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.727 17:16:58 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:03.662 17:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:03.920 [2024-07-23 17:16:59.117513] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:03.920 [2024-07-23 17:16:59.117676] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x7ab9b0 00:23:03.920 [2024-07-23 17:16:59.117689] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:03.920 [2024-07-23 17:16:59.117862] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x85b990 00:23:03.920 [2024-07-23 17:16:59.117990] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7ab9b0 00:23:03.920 [2024-07-23 17:16:59.118001] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7ab9b0 00:23:03.920 [2024-07-23 17:16:59.118093] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:03.920 BaseBdev4 00:23:03.920 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:23:03.920 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:03.920 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:03.920 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:03.920 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:03.921 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:03.921 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:04.179 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:04.179 [ 00:23:04.179 { 00:23:04.179 "name": "BaseBdev4", 00:23:04.179 "aliases": [ 00:23:04.179 "2800a82b-7292-479f-b31c-72e9ea7ea9be" 00:23:04.179 ], 00:23:04.179 "product_name": "Malloc disk", 00:23:04.180 "block_size": 512, 00:23:04.180 "num_blocks": 65536, 00:23:04.180 "uuid": "2800a82b-7292-479f-b31c-72e9ea7ea9be", 00:23:04.180 "assigned_rate_limits": { 00:23:04.180 "rw_ios_per_sec": 0, 00:23:04.180 "rw_mbytes_per_sec": 0, 00:23:04.180 "r_mbytes_per_sec": 0, 00:23:04.180 "w_mbytes_per_sec": 0 00:23:04.180 }, 00:23:04.180 "claimed": true, 00:23:04.180 "claim_type": "exclusive_write", 00:23:04.180 "zoned": false, 00:23:04.180 "supported_io_types": { 00:23:04.180 "read": true, 00:23:04.180 "write": true, 00:23:04.180 "unmap": true, 00:23:04.180 "flush": true, 00:23:04.180 "reset": true, 00:23:04.180 "nvme_admin": false, 00:23:04.180 "nvme_io": false, 00:23:04.180 "nvme_io_md": false, 00:23:04.180 "write_zeroes": true, 00:23:04.180 "zcopy": true, 00:23:04.180 "get_zone_info": false, 00:23:04.180 "zone_management": false, 00:23:04.180 "zone_append": false, 00:23:04.180 "compare": false, 00:23:04.180 "compare_and_write": false, 00:23:04.180 "abort": true, 00:23:04.180 "seek_hole": false, 00:23:04.180 "seek_data": false, 00:23:04.180 "copy": true, 00:23:04.180 "nvme_iov_md": false 00:23:04.180 }, 00:23:04.180 "memory_domains": [ 00:23:04.180 { 00:23:04.180 "dma_device_id": "system", 00:23:04.180 "dma_device_type": 1 00:23:04.180 }, 00:23:04.180 { 00:23:04.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:04.180 "dma_device_type": 2 00:23:04.180 } 00:23:04.180 ], 00:23:04.180 "driver_specific": {} 00:23:04.180 } 00:23:04.180 ] 
00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.180 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:04.439 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.439 "name": "Existed_Raid", 00:23:04.439 
"uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:23:04.439 "strip_size_kb": 64, 00:23:04.439 "state": "online", 00:23:04.439 "raid_level": "raid0", 00:23:04.439 "superblock": true, 00:23:04.439 "num_base_bdevs": 4, 00:23:04.439 "num_base_bdevs_discovered": 4, 00:23:04.439 "num_base_bdevs_operational": 4, 00:23:04.439 "base_bdevs_list": [ 00:23:04.439 { 00:23:04.439 "name": "BaseBdev1", 00:23:04.439 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:23:04.439 "is_configured": true, 00:23:04.439 "data_offset": 2048, 00:23:04.439 "data_size": 63488 00:23:04.439 }, 00:23:04.439 { 00:23:04.439 "name": "BaseBdev2", 00:23:04.439 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:04.439 "is_configured": true, 00:23:04.439 "data_offset": 2048, 00:23:04.439 "data_size": 63488 00:23:04.439 }, 00:23:04.439 { 00:23:04.439 "name": "BaseBdev3", 00:23:04.439 "uuid": "d0a50fd3-2110-41b6-9f86-3785140ea1b6", 00:23:04.439 "is_configured": true, 00:23:04.439 "data_offset": 2048, 00:23:04.439 "data_size": 63488 00:23:04.439 }, 00:23:04.439 { 00:23:04.439 "name": "BaseBdev4", 00:23:04.439 "uuid": "2800a82b-7292-479f-b31c-72e9ea7ea9be", 00:23:04.439 "is_configured": true, 00:23:04.439 "data_offset": 2048, 00:23:04.439 "data_size": 63488 00:23:04.439 } 00:23:04.439 ] 00:23:04.439 }' 00:23:04.439 17:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.439 17:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:05.373 17:17:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:05.373 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:05.632 [2024-07-23 17:17:00.834385] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:05.632 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:05.632 "name": "Existed_Raid", 00:23:05.632 "aliases": [ 00:23:05.632 "546ce774-2d6c-4aa4-a09b-f40d7f677722" 00:23:05.632 ], 00:23:05.632 "product_name": "Raid Volume", 00:23:05.632 "block_size": 512, 00:23:05.632 "num_blocks": 253952, 00:23:05.632 "uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:23:05.632 "assigned_rate_limits": { 00:23:05.632 "rw_ios_per_sec": 0, 00:23:05.632 "rw_mbytes_per_sec": 0, 00:23:05.632 "r_mbytes_per_sec": 0, 00:23:05.632 "w_mbytes_per_sec": 0 00:23:05.632 }, 00:23:05.632 "claimed": false, 00:23:05.632 "zoned": false, 00:23:05.632 "supported_io_types": { 00:23:05.632 "read": true, 00:23:05.632 "write": true, 00:23:05.632 "unmap": true, 00:23:05.632 "flush": true, 00:23:05.632 "reset": true, 00:23:05.632 "nvme_admin": false, 00:23:05.632 "nvme_io": false, 00:23:05.632 "nvme_io_md": false, 00:23:05.632 "write_zeroes": true, 00:23:05.632 "zcopy": false, 00:23:05.632 "get_zone_info": false, 00:23:05.632 "zone_management": false, 00:23:05.632 "zone_append": false, 00:23:05.632 "compare": false, 00:23:05.632 "compare_and_write": false, 00:23:05.632 "abort": false, 00:23:05.632 "seek_hole": false, 00:23:05.632 "seek_data": false, 00:23:05.632 "copy": false, 00:23:05.632 "nvme_iov_md": false 00:23:05.632 }, 00:23:05.632 
"memory_domains": [ 00:23:05.632 { 00:23:05.632 "dma_device_id": "system", 00:23:05.632 "dma_device_type": 1 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.632 "dma_device_type": 2 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "system", 00:23:05.632 "dma_device_type": 1 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.632 "dma_device_type": 2 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "system", 00:23:05.632 "dma_device_type": 1 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.632 "dma_device_type": 2 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "system", 00:23:05.632 "dma_device_type": 1 00:23:05.632 }, 00:23:05.632 { 00:23:05.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.632 "dma_device_type": 2 00:23:05.632 } 00:23:05.632 ], 00:23:05.632 "driver_specific": { 00:23:05.632 "raid": { 00:23:05.633 "uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:23:05.633 "strip_size_kb": 64, 00:23:05.633 "state": "online", 00:23:05.633 "raid_level": "raid0", 00:23:05.633 "superblock": true, 00:23:05.633 "num_base_bdevs": 4, 00:23:05.633 "num_base_bdevs_discovered": 4, 00:23:05.633 "num_base_bdevs_operational": 4, 00:23:05.633 "base_bdevs_list": [ 00:23:05.633 { 00:23:05.633 "name": "BaseBdev1", 00:23:05.633 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:23:05.633 "is_configured": true, 00:23:05.633 "data_offset": 2048, 00:23:05.633 "data_size": 63488 00:23:05.633 }, 00:23:05.633 { 00:23:05.633 "name": "BaseBdev2", 00:23:05.633 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:05.633 "is_configured": true, 00:23:05.633 "data_offset": 2048, 00:23:05.633 "data_size": 63488 00:23:05.633 }, 00:23:05.633 { 00:23:05.633 "name": "BaseBdev3", 00:23:05.633 "uuid": "d0a50fd3-2110-41b6-9f86-3785140ea1b6", 00:23:05.633 "is_configured": true, 00:23:05.633 "data_offset": 2048, 00:23:05.633 
"data_size": 63488 00:23:05.633 }, 00:23:05.633 { 00:23:05.633 "name": "BaseBdev4", 00:23:05.633 "uuid": "2800a82b-7292-479f-b31c-72e9ea7ea9be", 00:23:05.633 "is_configured": true, 00:23:05.633 "data_offset": 2048, 00:23:05.633 "data_size": 63488 00:23:05.633 } 00:23:05.633 ] 00:23:05.633 } 00:23:05.633 } 00:23:05.633 }' 00:23:05.633 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:05.633 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:05.633 BaseBdev2 00:23:05.633 BaseBdev3 00:23:05.633 BaseBdev4' 00:23:05.633 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:05.633 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:05.633 17:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:05.892 "name": "BaseBdev1", 00:23:05.892 "aliases": [ 00:23:05.892 "01755f35-7535-4eaf-b568-9ffbad354041" 00:23:05.892 ], 00:23:05.892 "product_name": "Malloc disk", 00:23:05.892 "block_size": 512, 00:23:05.892 "num_blocks": 65536, 00:23:05.892 "uuid": "01755f35-7535-4eaf-b568-9ffbad354041", 00:23:05.892 "assigned_rate_limits": { 00:23:05.892 "rw_ios_per_sec": 0, 00:23:05.892 "rw_mbytes_per_sec": 0, 00:23:05.892 "r_mbytes_per_sec": 0, 00:23:05.892 "w_mbytes_per_sec": 0 00:23:05.892 }, 00:23:05.892 "claimed": true, 00:23:05.892 "claim_type": "exclusive_write", 00:23:05.892 "zoned": false, 00:23:05.892 "supported_io_types": { 00:23:05.892 "read": true, 00:23:05.892 "write": true, 00:23:05.892 "unmap": true, 00:23:05.892 "flush": true, 00:23:05.892 "reset": true, 
00:23:05.892 "nvme_admin": false, 00:23:05.892 "nvme_io": false, 00:23:05.892 "nvme_io_md": false, 00:23:05.892 "write_zeroes": true, 00:23:05.892 "zcopy": true, 00:23:05.892 "get_zone_info": false, 00:23:05.892 "zone_management": false, 00:23:05.892 "zone_append": false, 00:23:05.892 "compare": false, 00:23:05.892 "compare_and_write": false, 00:23:05.892 "abort": true, 00:23:05.892 "seek_hole": false, 00:23:05.892 "seek_data": false, 00:23:05.892 "copy": true, 00:23:05.892 "nvme_iov_md": false 00:23:05.892 }, 00:23:05.892 "memory_domains": [ 00:23:05.892 { 00:23:05.892 "dma_device_id": "system", 00:23:05.892 "dma_device_type": 1 00:23:05.892 }, 00:23:05.892 { 00:23:05.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.892 "dma_device_type": 2 00:23:05.892 } 00:23:05.892 ], 00:23:05.892 "driver_specific": {} 00:23:05.892 }' 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:05.892 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:06.150 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:06.409 "name": "BaseBdev2", 00:23:06.409 "aliases": [ 00:23:06.409 "9ee8f984-eb35-4583-8430-83a6173f225b" 00:23:06.409 ], 00:23:06.409 "product_name": "Malloc disk", 00:23:06.409 "block_size": 512, 00:23:06.409 "num_blocks": 65536, 00:23:06.409 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:06.409 "assigned_rate_limits": { 00:23:06.409 "rw_ios_per_sec": 0, 00:23:06.409 "rw_mbytes_per_sec": 0, 00:23:06.409 "r_mbytes_per_sec": 0, 00:23:06.409 "w_mbytes_per_sec": 0 00:23:06.409 }, 00:23:06.409 "claimed": true, 00:23:06.409 "claim_type": "exclusive_write", 00:23:06.409 "zoned": false, 00:23:06.409 "supported_io_types": { 00:23:06.409 "read": true, 00:23:06.409 "write": true, 00:23:06.409 "unmap": true, 00:23:06.409 "flush": true, 00:23:06.409 "reset": true, 00:23:06.409 "nvme_admin": false, 00:23:06.409 "nvme_io": false, 00:23:06.409 "nvme_io_md": false, 00:23:06.409 "write_zeroes": true, 00:23:06.409 "zcopy": true, 00:23:06.409 "get_zone_info": false, 00:23:06.409 "zone_management": false, 00:23:06.409 "zone_append": false, 00:23:06.409 "compare": false, 00:23:06.409 "compare_and_write": false, 00:23:06.409 "abort": true, 00:23:06.409 "seek_hole": false, 00:23:06.409 "seek_data": false, 00:23:06.409 "copy": true, 00:23:06.409 "nvme_iov_md": false 00:23:06.409 }, 00:23:06.409 "memory_domains": [ 00:23:06.409 { 
00:23:06.409 "dma_device_id": "system", 00:23:06.409 "dma_device_type": 1 00:23:06.409 }, 00:23:06.409 { 00:23:06.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.409 "dma_device_type": 2 00:23:06.409 } 00:23:06.409 ], 00:23:06.409 "driver_specific": {} 00:23:06.409 }' 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:06.409 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:06.667 17:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:06.926 17:17:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:06.926 "name": "BaseBdev3", 00:23:06.926 "aliases": [ 00:23:06.926 "d0a50fd3-2110-41b6-9f86-3785140ea1b6" 00:23:06.926 ], 00:23:06.926 "product_name": "Malloc disk", 00:23:06.926 "block_size": 512, 00:23:06.926 "num_blocks": 65536, 00:23:06.926 "uuid": "d0a50fd3-2110-41b6-9f86-3785140ea1b6", 00:23:06.926 "assigned_rate_limits": { 00:23:06.926 "rw_ios_per_sec": 0, 00:23:06.926 "rw_mbytes_per_sec": 0, 00:23:06.926 "r_mbytes_per_sec": 0, 00:23:06.926 "w_mbytes_per_sec": 0 00:23:06.926 }, 00:23:06.926 "claimed": true, 00:23:06.926 "claim_type": "exclusive_write", 00:23:06.926 "zoned": false, 00:23:06.926 "supported_io_types": { 00:23:06.926 "read": true, 00:23:06.926 "write": true, 00:23:06.926 "unmap": true, 00:23:06.926 "flush": true, 00:23:06.926 "reset": true, 00:23:06.926 "nvme_admin": false, 00:23:06.926 "nvme_io": false, 00:23:06.926 "nvme_io_md": false, 00:23:06.926 "write_zeroes": true, 00:23:06.926 "zcopy": true, 00:23:06.926 "get_zone_info": false, 00:23:06.926 "zone_management": false, 00:23:06.926 "zone_append": false, 00:23:06.926 "compare": false, 00:23:06.926 "compare_and_write": false, 00:23:06.926 "abort": true, 00:23:06.926 "seek_hole": false, 00:23:06.926 "seek_data": false, 00:23:06.926 "copy": true, 00:23:06.926 "nvme_iov_md": false 00:23:06.926 }, 00:23:06.926 "memory_domains": [ 00:23:06.926 { 00:23:06.926 "dma_device_id": "system", 00:23:06.926 "dma_device_type": 1 00:23:06.926 }, 00:23:06.926 { 00:23:06.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.926 "dma_device_type": 2 00:23:06.926 } 00:23:06.926 ], 00:23:06.926 "driver_specific": {} 00:23:06.926 }' 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:06.926 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:07.184 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:07.442 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:07.442 "name": "BaseBdev4", 00:23:07.442 "aliases": [ 00:23:07.442 "2800a82b-7292-479f-b31c-72e9ea7ea9be" 00:23:07.442 ], 00:23:07.442 "product_name": "Malloc disk", 00:23:07.442 "block_size": 512, 00:23:07.442 "num_blocks": 65536, 00:23:07.442 "uuid": "2800a82b-7292-479f-b31c-72e9ea7ea9be", 00:23:07.442 "assigned_rate_limits": { 00:23:07.442 "rw_ios_per_sec": 0, 00:23:07.442 "rw_mbytes_per_sec": 0, 00:23:07.442 "r_mbytes_per_sec": 0, 00:23:07.442 "w_mbytes_per_sec": 0 
00:23:07.442 }, 00:23:07.442 "claimed": true, 00:23:07.442 "claim_type": "exclusive_write", 00:23:07.442 "zoned": false, 00:23:07.442 "supported_io_types": { 00:23:07.442 "read": true, 00:23:07.442 "write": true, 00:23:07.442 "unmap": true, 00:23:07.442 "flush": true, 00:23:07.442 "reset": true, 00:23:07.442 "nvme_admin": false, 00:23:07.442 "nvme_io": false, 00:23:07.442 "nvme_io_md": false, 00:23:07.442 "write_zeroes": true, 00:23:07.442 "zcopy": true, 00:23:07.442 "get_zone_info": false, 00:23:07.442 "zone_management": false, 00:23:07.442 "zone_append": false, 00:23:07.442 "compare": false, 00:23:07.442 "compare_and_write": false, 00:23:07.442 "abort": true, 00:23:07.442 "seek_hole": false, 00:23:07.442 "seek_data": false, 00:23:07.442 "copy": true, 00:23:07.442 "nvme_iov_md": false 00:23:07.442 }, 00:23:07.442 "memory_domains": [ 00:23:07.442 { 00:23:07.442 "dma_device_id": "system", 00:23:07.442 "dma_device_type": 1 00:23:07.442 }, 00:23:07.442 { 00:23:07.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.442 "dma_device_type": 2 00:23:07.442 } 00:23:07.442 ], 00:23:07.442 "driver_specific": {} 00:23:07.442 }' 00:23:07.442 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:07.442 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:07.442 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:07.443 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.443 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.443 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:07.443 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.443 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.701 
17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:07.701 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.701 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.701 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:07.701 17:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:07.960 [2024-07-23 17:17:03.204401] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:07.960 [2024-07-23 17:17:03.204427] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:07.960 [2024-07-23 17:17:03.204476] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.961 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:08.219 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.219 "name": "Existed_Raid", 00:23:08.219 "uuid": "546ce774-2d6c-4aa4-a09b-f40d7f677722", 00:23:08.219 "strip_size_kb": 64, 00:23:08.219 "state": "offline", 00:23:08.219 "raid_level": "raid0", 00:23:08.219 "superblock": true, 00:23:08.219 "num_base_bdevs": 4, 00:23:08.219 "num_base_bdevs_discovered": 3, 00:23:08.219 "num_base_bdevs_operational": 3, 00:23:08.219 "base_bdevs_list": [ 00:23:08.219 { 00:23:08.219 "name": null, 00:23:08.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.219 "is_configured": false, 00:23:08.219 "data_offset": 2048, 00:23:08.219 "data_size": 63488 00:23:08.219 }, 00:23:08.219 { 00:23:08.219 "name": "BaseBdev2", 00:23:08.219 "uuid": "9ee8f984-eb35-4583-8430-83a6173f225b", 00:23:08.219 "is_configured": true, 00:23:08.219 "data_offset": 2048, 00:23:08.219 "data_size": 63488 00:23:08.219 }, 00:23:08.219 
{ 00:23:08.219 "name": "BaseBdev3", 00:23:08.219 "uuid": "d0a50fd3-2110-41b6-9f86-3785140ea1b6", 00:23:08.219 "is_configured": true, 00:23:08.219 "data_offset": 2048, 00:23:08.219 "data_size": 63488 00:23:08.219 }, 00:23:08.219 { 00:23:08.219 "name": "BaseBdev4", 00:23:08.219 "uuid": "2800a82b-7292-479f-b31c-72e9ea7ea9be", 00:23:08.219 "is_configured": true, 00:23:08.219 "data_offset": 2048, 00:23:08.219 "data_size": 63488 00:23:08.219 } 00:23:08.219 ] 00:23:08.219 }' 00:23:08.219 17:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.219 17:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:08.785 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:08.785 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:08.785 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:08.785 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.043 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:09.043 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:09.043 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:09.302 [2024-07-23 17:17:04.565912] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:09.302 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:09.302 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:09.302 
17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.302 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:09.560 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:09.560 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:09.560 17:17:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:23:09.819 [2024-07-23 17:17:05.083696] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:09.819 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:09.819 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:09.819 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.819 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:10.077 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:10.077 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:10.077 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:23:10.336 [2024-07-23 17:17:05.601444] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:23:10.336 [2024-07-23 17:17:05.601483] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7ab9b0 name Existed_Raid, state offline 00:23:10.336 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:10.336 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:10.336 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.336 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:10.595 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:10.595 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:10.595 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:23:10.595 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:23:10.595 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:10.595 17:17:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:10.854 BaseBdev2 00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:10.854 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:11.114 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:11.373 [ 00:23:11.373 { 00:23:11.373 "name": "BaseBdev2", 00:23:11.373 "aliases": [ 00:23:11.373 "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b" 00:23:11.373 ], 00:23:11.373 "product_name": "Malloc disk", 00:23:11.373 "block_size": 512, 00:23:11.373 "num_blocks": 65536, 00:23:11.373 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:11.373 "assigned_rate_limits": { 00:23:11.373 "rw_ios_per_sec": 0, 00:23:11.373 "rw_mbytes_per_sec": 0, 00:23:11.373 "r_mbytes_per_sec": 0, 00:23:11.373 "w_mbytes_per_sec": 0 00:23:11.373 }, 00:23:11.373 "claimed": false, 00:23:11.373 "zoned": false, 00:23:11.373 "supported_io_types": { 00:23:11.373 "read": true, 00:23:11.373 "write": true, 00:23:11.373 "unmap": true, 00:23:11.373 "flush": true, 00:23:11.373 "reset": true, 00:23:11.373 "nvme_admin": false, 00:23:11.373 "nvme_io": false, 00:23:11.373 "nvme_io_md": false, 00:23:11.373 "write_zeroes": true, 00:23:11.373 "zcopy": true, 00:23:11.373 "get_zone_info": false, 00:23:11.373 "zone_management": false, 00:23:11.373 "zone_append": false, 00:23:11.373 "compare": false, 00:23:11.373 "compare_and_write": false, 00:23:11.373 "abort": true, 00:23:11.373 "seek_hole": false, 00:23:11.373 "seek_data": false, 00:23:11.373 "copy": true, 00:23:11.373 "nvme_iov_md": false 00:23:11.373 }, 00:23:11.373 "memory_domains": [ 00:23:11.373 { 00:23:11.373 "dma_device_id": "system", 00:23:11.373 "dma_device_type": 1 00:23:11.373 }, 00:23:11.373 { 00:23:11.373 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.373 "dma_device_type": 2 00:23:11.373 } 00:23:11.373 ], 00:23:11.373 "driver_specific": {} 00:23:11.373 } 00:23:11.373 ] 00:23:11.373 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:11.373 17:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:11.373 17:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:11.373 17:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:11.697 BaseBdev3 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:11.697 17:17:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:11.956 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:11.956 [ 00:23:11.956 { 00:23:11.956 "name": "BaseBdev3", 00:23:11.956 "aliases": [ 00:23:11.956 "6bfae68f-199c-4fb2-8e2e-444853a927fd" 
00:23:11.956 ], 00:23:11.956 "product_name": "Malloc disk", 00:23:11.956 "block_size": 512, 00:23:11.956 "num_blocks": 65536, 00:23:11.956 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:11.956 "assigned_rate_limits": { 00:23:11.956 "rw_ios_per_sec": 0, 00:23:11.956 "rw_mbytes_per_sec": 0, 00:23:11.956 "r_mbytes_per_sec": 0, 00:23:11.956 "w_mbytes_per_sec": 0 00:23:11.956 }, 00:23:11.956 "claimed": false, 00:23:11.956 "zoned": false, 00:23:11.956 "supported_io_types": { 00:23:11.956 "read": true, 00:23:11.956 "write": true, 00:23:11.956 "unmap": true, 00:23:11.956 "flush": true, 00:23:11.956 "reset": true, 00:23:11.956 "nvme_admin": false, 00:23:11.956 "nvme_io": false, 00:23:11.956 "nvme_io_md": false, 00:23:11.956 "write_zeroes": true, 00:23:11.956 "zcopy": true, 00:23:11.956 "get_zone_info": false, 00:23:11.956 "zone_management": false, 00:23:11.956 "zone_append": false, 00:23:11.956 "compare": false, 00:23:11.956 "compare_and_write": false, 00:23:11.956 "abort": true, 00:23:11.956 "seek_hole": false, 00:23:11.956 "seek_data": false, 00:23:11.956 "copy": true, 00:23:11.956 "nvme_iov_md": false 00:23:11.956 }, 00:23:11.956 "memory_domains": [ 00:23:11.956 { 00:23:11.956 "dma_device_id": "system", 00:23:11.956 "dma_device_type": 1 00:23:11.956 }, 00:23:11.956 { 00:23:11.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.956 "dma_device_type": 2 00:23:11.956 } 00:23:11.956 ], 00:23:11.956 "driver_specific": {} 00:23:11.956 } 00:23:11.956 ] 00:23:11.956 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:11.956 17:17:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:11.956 17:17:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:11.956 17:17:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:23:12.215 BaseBdev4 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:12.215 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:12.474 17:17:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:12.733 [ 00:23:12.733 { 00:23:12.733 "name": "BaseBdev4", 00:23:12.733 "aliases": [ 00:23:12.733 "9fe0a1f9-4e07-4705-8932-e75242501730" 00:23:12.733 ], 00:23:12.733 "product_name": "Malloc disk", 00:23:12.733 "block_size": 512, 00:23:12.733 "num_blocks": 65536, 00:23:12.733 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:12.733 "assigned_rate_limits": { 00:23:12.733 "rw_ios_per_sec": 0, 00:23:12.733 "rw_mbytes_per_sec": 0, 00:23:12.733 "r_mbytes_per_sec": 0, 00:23:12.733 "w_mbytes_per_sec": 0 00:23:12.733 }, 00:23:12.733 "claimed": false, 00:23:12.733 "zoned": false, 00:23:12.733 "supported_io_types": { 00:23:12.733 "read": true, 00:23:12.733 "write": true, 00:23:12.733 "unmap": true, 00:23:12.733 "flush": true, 00:23:12.733 "reset": true, 00:23:12.733 "nvme_admin": false, 00:23:12.733 "nvme_io": false, 00:23:12.733 
"nvme_io_md": false, 00:23:12.733 "write_zeroes": true, 00:23:12.733 "zcopy": true, 00:23:12.733 "get_zone_info": false, 00:23:12.733 "zone_management": false, 00:23:12.733 "zone_append": false, 00:23:12.733 "compare": false, 00:23:12.733 "compare_and_write": false, 00:23:12.733 "abort": true, 00:23:12.733 "seek_hole": false, 00:23:12.733 "seek_data": false, 00:23:12.733 "copy": true, 00:23:12.733 "nvme_iov_md": false 00:23:12.733 }, 00:23:12.733 "memory_domains": [ 00:23:12.733 { 00:23:12.733 "dma_device_id": "system", 00:23:12.733 "dma_device_type": 1 00:23:12.733 }, 00:23:12.733 { 00:23:12.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:12.733 "dma_device_type": 2 00:23:12.733 } 00:23:12.733 ], 00:23:12.733 "driver_specific": {} 00:23:12.733 } 00:23:12.733 ] 00:23:12.733 17:17:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:12.733 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:12.733 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:12.733 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:12.993 [2024-07-23 17:17:08.336929] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:12.993 [2024-07-23 17:17:08.336971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:12.993 [2024-07-23 17:17:08.336989] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:12.993 [2024-07-23 17:17:08.338373] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:12.993 [2024-07-23 17:17:08.338414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.993 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:13.252 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.252 "name": "Existed_Raid", 00:23:13.252 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:13.252 "strip_size_kb": 64, 00:23:13.252 "state": "configuring", 00:23:13.252 "raid_level": "raid0", 00:23:13.252 "superblock": true, 00:23:13.252 "num_base_bdevs": 4, 00:23:13.252 "num_base_bdevs_discovered": 3, 00:23:13.252 
"num_base_bdevs_operational": 4, 00:23:13.252 "base_bdevs_list": [ 00:23:13.252 { 00:23:13.252 "name": "BaseBdev1", 00:23:13.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.252 "is_configured": false, 00:23:13.252 "data_offset": 0, 00:23:13.252 "data_size": 0 00:23:13.252 }, 00:23:13.252 { 00:23:13.252 "name": "BaseBdev2", 00:23:13.252 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:13.252 "is_configured": true, 00:23:13.252 "data_offset": 2048, 00:23:13.252 "data_size": 63488 00:23:13.252 }, 00:23:13.252 { 00:23:13.252 "name": "BaseBdev3", 00:23:13.252 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:13.253 "is_configured": true, 00:23:13.253 "data_offset": 2048, 00:23:13.253 "data_size": 63488 00:23:13.253 }, 00:23:13.253 { 00:23:13.253 "name": "BaseBdev4", 00:23:13.253 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:13.253 "is_configured": true, 00:23:13.253 "data_offset": 2048, 00:23:13.253 "data_size": 63488 00:23:13.253 } 00:23:13.253 ] 00:23:13.253 }' 00:23:13.253 17:17:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.253 17:17:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:13.821 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:14.105 [2024-07-23 17:17:09.419743] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.105 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:14.364 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:14.364 "name": "Existed_Raid", 00:23:14.364 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:14.364 "strip_size_kb": 64, 00:23:14.364 "state": "configuring", 00:23:14.364 "raid_level": "raid0", 00:23:14.364 "superblock": true, 00:23:14.364 "num_base_bdevs": 4, 00:23:14.364 "num_base_bdevs_discovered": 2, 00:23:14.364 "num_base_bdevs_operational": 4, 00:23:14.364 "base_bdevs_list": [ 00:23:14.364 { 00:23:14.364 "name": "BaseBdev1", 00:23:14.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.364 "is_configured": false, 00:23:14.364 "data_offset": 0, 00:23:14.364 "data_size": 0 00:23:14.364 }, 00:23:14.364 { 00:23:14.364 "name": null, 00:23:14.364 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:14.364 "is_configured": false, 00:23:14.364 "data_offset": 2048, 00:23:14.364 "data_size": 
63488 00:23:14.364 }, 00:23:14.364 { 00:23:14.364 "name": "BaseBdev3", 00:23:14.364 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:14.364 "is_configured": true, 00:23:14.364 "data_offset": 2048, 00:23:14.364 "data_size": 63488 00:23:14.364 }, 00:23:14.364 { 00:23:14.364 "name": "BaseBdev4", 00:23:14.364 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:14.364 "is_configured": true, 00:23:14.364 "data_offset": 2048, 00:23:14.364 "data_size": 63488 00:23:14.364 } 00:23:14.364 ] 00:23:14.364 }' 00:23:14.364 17:17:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:14.364 17:17:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:14.931 17:17:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:14.931 17:17:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.190 17:17:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:23:15.190 17:17:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:15.449 [2024-07-23 17:17:10.770602] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:15.449 BaseBdev1 00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 
00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:15.449 17:17:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:15.708 17:17:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:15.967 [ 00:23:15.967 { 00:23:15.967 "name": "BaseBdev1", 00:23:15.967 "aliases": [ 00:23:15.967 "2c7e8daf-a727-4aeb-8df0-133858614231" 00:23:15.967 ], 00:23:15.967 "product_name": "Malloc disk", 00:23:15.967 "block_size": 512, 00:23:15.967 "num_blocks": 65536, 00:23:15.967 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:15.967 "assigned_rate_limits": { 00:23:15.967 "rw_ios_per_sec": 0, 00:23:15.967 "rw_mbytes_per_sec": 0, 00:23:15.967 "r_mbytes_per_sec": 0, 00:23:15.967 "w_mbytes_per_sec": 0 00:23:15.967 }, 00:23:15.967 "claimed": true, 00:23:15.967 "claim_type": "exclusive_write", 00:23:15.967 "zoned": false, 00:23:15.967 "supported_io_types": { 00:23:15.967 "read": true, 00:23:15.967 "write": true, 00:23:15.967 "unmap": true, 00:23:15.967 "flush": true, 00:23:15.967 "reset": true, 00:23:15.967 "nvme_admin": false, 00:23:15.967 "nvme_io": false, 00:23:15.967 "nvme_io_md": false, 00:23:15.967 "write_zeroes": true, 00:23:15.967 "zcopy": true, 00:23:15.967 "get_zone_info": false, 00:23:15.967 "zone_management": false, 00:23:15.967 "zone_append": false, 00:23:15.967 "compare": false, 00:23:15.967 "compare_and_write": false, 00:23:15.967 "abort": true, 00:23:15.967 "seek_hole": false, 00:23:15.967 "seek_data": false, 00:23:15.967 "copy": true, 00:23:15.967 "nvme_iov_md": false 00:23:15.967 }, 00:23:15.967 
"memory_domains": [ 00:23:15.967 { 00:23:15.967 "dma_device_id": "system", 00:23:15.967 "dma_device_type": 1 00:23:15.967 }, 00:23:15.967 { 00:23:15.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:15.967 "dma_device_type": 2 00:23:15.967 } 00:23:15.967 ], 00:23:15.967 "driver_specific": {} 00:23:15.967 } 00:23:15.967 ] 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.967 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:16.225 17:17:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.225 "name": "Existed_Raid", 00:23:16.225 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:16.225 "strip_size_kb": 64, 00:23:16.225 "state": "configuring", 00:23:16.225 "raid_level": "raid0", 00:23:16.225 "superblock": true, 00:23:16.225 "num_base_bdevs": 4, 00:23:16.225 "num_base_bdevs_discovered": 3, 00:23:16.225 "num_base_bdevs_operational": 4, 00:23:16.225 "base_bdevs_list": [ 00:23:16.225 { 00:23:16.225 "name": "BaseBdev1", 00:23:16.225 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:16.225 "is_configured": true, 00:23:16.225 "data_offset": 2048, 00:23:16.225 "data_size": 63488 00:23:16.225 }, 00:23:16.226 { 00:23:16.226 "name": null, 00:23:16.226 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:16.226 "is_configured": false, 00:23:16.226 "data_offset": 2048, 00:23:16.226 "data_size": 63488 00:23:16.226 }, 00:23:16.226 { 00:23:16.226 "name": "BaseBdev3", 00:23:16.226 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:16.226 "is_configured": true, 00:23:16.226 "data_offset": 2048, 00:23:16.226 "data_size": 63488 00:23:16.226 }, 00:23:16.226 { 00:23:16.226 "name": "BaseBdev4", 00:23:16.226 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:16.226 "is_configured": true, 00:23:16.226 "data_offset": 2048, 00:23:16.226 "data_size": 63488 00:23:16.226 } 00:23:16.226 ] 00:23:16.226 }' 00:23:16.226 17:17:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.226 17:17:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:16.792 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.792 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:17.052 17:17:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:23:17.052 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:23:17.311 [2024-07-23 17:17:12.607717] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.311 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:17.569 17:17:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.570 "name": "Existed_Raid", 00:23:17.570 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:17.570 "strip_size_kb": 64, 00:23:17.570 "state": "configuring", 00:23:17.570 "raid_level": "raid0", 00:23:17.570 "superblock": true, 00:23:17.570 "num_base_bdevs": 4, 00:23:17.570 "num_base_bdevs_discovered": 2, 00:23:17.570 "num_base_bdevs_operational": 4, 00:23:17.570 "base_bdevs_list": [ 00:23:17.570 { 00:23:17.570 "name": "BaseBdev1", 00:23:17.570 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:17.570 "is_configured": true, 00:23:17.570 "data_offset": 2048, 00:23:17.570 "data_size": 63488 00:23:17.570 }, 00:23:17.570 { 00:23:17.570 "name": null, 00:23:17.570 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:17.570 "is_configured": false, 00:23:17.570 "data_offset": 2048, 00:23:17.570 "data_size": 63488 00:23:17.570 }, 00:23:17.570 { 00:23:17.570 "name": null, 00:23:17.570 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:17.570 "is_configured": false, 00:23:17.570 "data_offset": 2048, 00:23:17.570 "data_size": 63488 00:23:17.570 }, 00:23:17.570 { 00:23:17.570 "name": "BaseBdev4", 00:23:17.570 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:17.570 "is_configured": true, 00:23:17.570 "data_offset": 2048, 00:23:17.570 "data_size": 63488 00:23:17.570 } 00:23:17.570 ] 00:23:17.570 }' 00:23:17.570 17:17:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.570 17:17:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:18.142 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.142 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:18.400 17:17:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:23:18.400 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:23:18.658 [2024-07-23 17:17:13.907173] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.658 17:17:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:18.917 17:17:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.917 "name": "Existed_Raid", 00:23:18.917 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:18.917 "strip_size_kb": 64, 00:23:18.917 "state": "configuring", 00:23:18.917 "raid_level": "raid0", 00:23:18.917 "superblock": true, 00:23:18.917 "num_base_bdevs": 4, 00:23:18.917 "num_base_bdevs_discovered": 3, 00:23:18.917 "num_base_bdevs_operational": 4, 00:23:18.917 "base_bdevs_list": [ 00:23:18.917 { 00:23:18.917 "name": "BaseBdev1", 00:23:18.917 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:18.917 "is_configured": true, 00:23:18.917 "data_offset": 2048, 00:23:18.917 "data_size": 63488 00:23:18.917 }, 00:23:18.917 { 00:23:18.917 "name": null, 00:23:18.917 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:18.917 "is_configured": false, 00:23:18.917 "data_offset": 2048, 00:23:18.917 "data_size": 63488 00:23:18.917 }, 00:23:18.917 { 00:23:18.917 "name": "BaseBdev3", 00:23:18.917 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:18.917 "is_configured": true, 00:23:18.917 "data_offset": 2048, 00:23:18.917 "data_size": 63488 00:23:18.917 }, 00:23:18.917 { 00:23:18.917 "name": "BaseBdev4", 00:23:18.917 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:18.917 "is_configured": true, 00:23:18.917 "data_offset": 2048, 00:23:18.917 "data_size": 63488 00:23:18.917 } 00:23:18.917 ] 00:23:18.917 }' 00:23:18.917 17:17:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.917 17:17:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:19.483 17:17:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.483 17:17:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:19.742 17:17:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:23:19.742 17:17:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:20.001 [2024-07-23 17:17:15.198621] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.001 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:20.260 17:17:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:20.260 "name": "Existed_Raid", 00:23:20.260 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:20.260 "strip_size_kb": 64, 00:23:20.260 "state": "configuring", 00:23:20.260 "raid_level": "raid0", 00:23:20.260 "superblock": true, 00:23:20.260 "num_base_bdevs": 4, 00:23:20.260 "num_base_bdevs_discovered": 2, 00:23:20.260 "num_base_bdevs_operational": 4, 00:23:20.260 "base_bdevs_list": [ 00:23:20.260 { 00:23:20.260 "name": null, 00:23:20.260 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:20.260 "is_configured": false, 00:23:20.260 "data_offset": 2048, 00:23:20.260 "data_size": 63488 00:23:20.260 }, 00:23:20.260 { 00:23:20.260 "name": null, 00:23:20.260 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:20.260 "is_configured": false, 00:23:20.260 "data_offset": 2048, 00:23:20.260 "data_size": 63488 00:23:20.260 }, 00:23:20.260 { 00:23:20.260 "name": "BaseBdev3", 00:23:20.260 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:20.260 "is_configured": true, 00:23:20.260 "data_offset": 2048, 00:23:20.260 "data_size": 63488 00:23:20.260 }, 00:23:20.260 { 00:23:20.260 "name": "BaseBdev4", 00:23:20.260 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:20.260 "is_configured": true, 00:23:20.260 "data_offset": 2048, 00:23:20.260 "data_size": 63488 00:23:20.260 } 00:23:20.260 ] 00:23:20.260 }' 00:23:20.260 17:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:20.260 17:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:20.826 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.827 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:21.085 17:17:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:23:21.085 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:23:21.344 [2024-07-23 17:17:16.596974] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.344 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:21.602 17:17:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.602 "name": "Existed_Raid", 00:23:21.602 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:21.602 "strip_size_kb": 64, 00:23:21.602 "state": "configuring", 00:23:21.602 "raid_level": "raid0", 00:23:21.602 "superblock": true, 00:23:21.602 "num_base_bdevs": 4, 00:23:21.602 "num_base_bdevs_discovered": 3, 00:23:21.603 "num_base_bdevs_operational": 4, 00:23:21.603 "base_bdevs_list": [ 00:23:21.603 { 00:23:21.603 "name": null, 00:23:21.603 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:21.603 "is_configured": false, 00:23:21.603 "data_offset": 2048, 00:23:21.603 "data_size": 63488 00:23:21.603 }, 00:23:21.603 { 00:23:21.603 "name": "BaseBdev2", 00:23:21.603 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:21.603 "is_configured": true, 00:23:21.603 "data_offset": 2048, 00:23:21.603 "data_size": 63488 00:23:21.603 }, 00:23:21.603 { 00:23:21.603 "name": "BaseBdev3", 00:23:21.603 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:21.603 "is_configured": true, 00:23:21.603 "data_offset": 2048, 00:23:21.603 "data_size": 63488 00:23:21.603 }, 00:23:21.603 { 00:23:21.603 "name": "BaseBdev4", 00:23:21.603 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:21.603 "is_configured": true, 00:23:21.603 "data_offset": 2048, 00:23:21.603 "data_size": 63488 00:23:21.603 } 00:23:21.603 ] 00:23:21.603 }' 00:23:21.603 17:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.603 17:17:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:22.170 17:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.170 17:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:22.428 17:17:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:22.429 17:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.429 17:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:22.688 17:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2c7e8daf-a727-4aeb-8df0-133858614231 00:23:22.688 [2024-07-23 17:17:18.092183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:22.688 [2024-07-23 17:17:18.092337] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x9608f0 00:23:22.688 [2024-07-23 17:17:18.092351] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:22.688 [2024-07-23 17:17:18.092520] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x85d4a0 00:23:22.688 [2024-07-23 17:17:18.092634] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9608f0 00:23:22.688 [2024-07-23 17:17:18.092644] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x9608f0 00:23:22.688 [2024-07-23 17:17:18.092738] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:22.688 NewBaseBdev 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 
-- # local i 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:22.948 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:23.207 [ 00:23:23.207 { 00:23:23.207 "name": "NewBaseBdev", 00:23:23.207 "aliases": [ 00:23:23.207 "2c7e8daf-a727-4aeb-8df0-133858614231" 00:23:23.207 ], 00:23:23.207 "product_name": "Malloc disk", 00:23:23.207 "block_size": 512, 00:23:23.207 "num_blocks": 65536, 00:23:23.207 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:23.207 "assigned_rate_limits": { 00:23:23.207 "rw_ios_per_sec": 0, 00:23:23.207 "rw_mbytes_per_sec": 0, 00:23:23.207 "r_mbytes_per_sec": 0, 00:23:23.207 "w_mbytes_per_sec": 0 00:23:23.207 }, 00:23:23.207 "claimed": true, 00:23:23.207 "claim_type": "exclusive_write", 00:23:23.207 "zoned": false, 00:23:23.207 "supported_io_types": { 00:23:23.207 "read": true, 00:23:23.207 "write": true, 00:23:23.207 "unmap": true, 00:23:23.207 "flush": true, 00:23:23.207 "reset": true, 00:23:23.207 "nvme_admin": false, 00:23:23.207 "nvme_io": false, 00:23:23.207 "nvme_io_md": false, 00:23:23.207 "write_zeroes": true, 00:23:23.207 "zcopy": true, 00:23:23.207 "get_zone_info": false, 00:23:23.207 "zone_management": false, 00:23:23.207 "zone_append": false, 00:23:23.207 "compare": false, 00:23:23.207 "compare_and_write": false, 00:23:23.207 "abort": true, 00:23:23.207 "seek_hole": false, 00:23:23.207 "seek_data": false, 00:23:23.207 "copy": true, 00:23:23.207 "nvme_iov_md": false 00:23:23.207 }, 
00:23:23.207 "memory_domains": [ 00:23:23.207 { 00:23:23.207 "dma_device_id": "system", 00:23:23.207 "dma_device_type": 1 00:23:23.207 }, 00:23:23.207 { 00:23:23.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.207 "dma_device_type": 2 00:23:23.207 } 00:23:23.207 ], 00:23:23.207 "driver_specific": {} 00:23:23.207 } 00:23:23.207 ] 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.207 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:23.465 17:17:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.465 "name": "Existed_Raid", 00:23:23.465 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:23.465 "strip_size_kb": 64, 00:23:23.465 "state": "online", 00:23:23.465 "raid_level": "raid0", 00:23:23.465 "superblock": true, 00:23:23.465 "num_base_bdevs": 4, 00:23:23.465 "num_base_bdevs_discovered": 4, 00:23:23.465 "num_base_bdevs_operational": 4, 00:23:23.465 "base_bdevs_list": [ 00:23:23.465 { 00:23:23.465 "name": "NewBaseBdev", 00:23:23.465 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:23.465 "is_configured": true, 00:23:23.465 "data_offset": 2048, 00:23:23.465 "data_size": 63488 00:23:23.465 }, 00:23:23.465 { 00:23:23.465 "name": "BaseBdev2", 00:23:23.465 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:23.465 "is_configured": true, 00:23:23.465 "data_offset": 2048, 00:23:23.465 "data_size": 63488 00:23:23.465 }, 00:23:23.465 { 00:23:23.465 "name": "BaseBdev3", 00:23:23.465 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:23.465 "is_configured": true, 00:23:23.465 "data_offset": 2048, 00:23:23.465 "data_size": 63488 00:23:23.465 }, 00:23:23.465 { 00:23:23.466 "name": "BaseBdev4", 00:23:23.466 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:23.466 "is_configured": true, 00:23:23.466 "data_offset": 2048, 00:23:23.466 "data_size": 63488 00:23:23.466 } 00:23:23.466 ] 00:23:23.466 }' 00:23:23.466 17:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.466 17:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:24.402 [2024-07-23 17:17:19.692781] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:24.402 "name": "Existed_Raid", 00:23:24.402 "aliases": [ 00:23:24.402 "6f083bff-e5d0-4882-9800-4f7aa3ae6c42" 00:23:24.402 ], 00:23:24.402 "product_name": "Raid Volume", 00:23:24.402 "block_size": 512, 00:23:24.402 "num_blocks": 253952, 00:23:24.402 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:24.402 "assigned_rate_limits": { 00:23:24.402 "rw_ios_per_sec": 0, 00:23:24.402 "rw_mbytes_per_sec": 0, 00:23:24.402 "r_mbytes_per_sec": 0, 00:23:24.402 "w_mbytes_per_sec": 0 00:23:24.402 }, 00:23:24.402 "claimed": false, 00:23:24.402 "zoned": false, 00:23:24.402 "supported_io_types": { 00:23:24.402 "read": true, 00:23:24.402 "write": true, 00:23:24.402 "unmap": true, 00:23:24.402 "flush": true, 00:23:24.402 "reset": true, 00:23:24.402 "nvme_admin": false, 00:23:24.402 "nvme_io": false, 00:23:24.402 "nvme_io_md": false, 00:23:24.402 "write_zeroes": true, 00:23:24.402 "zcopy": false, 00:23:24.402 "get_zone_info": false, 00:23:24.402 "zone_management": false, 00:23:24.402 "zone_append": false, 00:23:24.402 "compare": false, 00:23:24.402 "compare_and_write": false, 00:23:24.402 "abort": false, 00:23:24.402 "seek_hole": 
false, 00:23:24.402 "seek_data": false, 00:23:24.402 "copy": false, 00:23:24.402 "nvme_iov_md": false 00:23:24.402 }, 00:23:24.402 "memory_domains": [ 00:23:24.402 { 00:23:24.402 "dma_device_id": "system", 00:23:24.402 "dma_device_type": 1 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.402 "dma_device_type": 2 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "system", 00:23:24.402 "dma_device_type": 1 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.402 "dma_device_type": 2 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "system", 00:23:24.402 "dma_device_type": 1 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.402 "dma_device_type": 2 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "system", 00:23:24.402 "dma_device_type": 1 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.402 "dma_device_type": 2 00:23:24.402 } 00:23:24.402 ], 00:23:24.402 "driver_specific": { 00:23:24.402 "raid": { 00:23:24.402 "uuid": "6f083bff-e5d0-4882-9800-4f7aa3ae6c42", 00:23:24.402 "strip_size_kb": 64, 00:23:24.402 "state": "online", 00:23:24.402 "raid_level": "raid0", 00:23:24.402 "superblock": true, 00:23:24.402 "num_base_bdevs": 4, 00:23:24.402 "num_base_bdevs_discovered": 4, 00:23:24.402 "num_base_bdevs_operational": 4, 00:23:24.402 "base_bdevs_list": [ 00:23:24.402 { 00:23:24.402 "name": "NewBaseBdev", 00:23:24.402 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:24.402 "is_configured": true, 00:23:24.402 "data_offset": 2048, 00:23:24.402 "data_size": 63488 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "name": "BaseBdev2", 00:23:24.402 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:24.402 "is_configured": true, 00:23:24.402 "data_offset": 2048, 00:23:24.402 "data_size": 63488 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "name": "BaseBdev3", 00:23:24.402 
"uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:24.402 "is_configured": true, 00:23:24.402 "data_offset": 2048, 00:23:24.402 "data_size": 63488 00:23:24.402 }, 00:23:24.402 { 00:23:24.402 "name": "BaseBdev4", 00:23:24.402 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:24.402 "is_configured": true, 00:23:24.402 "data_offset": 2048, 00:23:24.402 "data_size": 63488 00:23:24.402 } 00:23:24.402 ] 00:23:24.402 } 00:23:24.402 } 00:23:24.402 }' 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:24.402 BaseBdev2 00:23:24.402 BaseBdev3 00:23:24.402 BaseBdev4' 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:23:24.402 17:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:24.661 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:24.661 "name": "NewBaseBdev", 00:23:24.661 "aliases": [ 00:23:24.661 "2c7e8daf-a727-4aeb-8df0-133858614231" 00:23:24.661 ], 00:23:24.661 "product_name": "Malloc disk", 00:23:24.661 "block_size": 512, 00:23:24.661 "num_blocks": 65536, 00:23:24.661 "uuid": "2c7e8daf-a727-4aeb-8df0-133858614231", 00:23:24.661 "assigned_rate_limits": { 00:23:24.661 "rw_ios_per_sec": 0, 00:23:24.661 "rw_mbytes_per_sec": 0, 00:23:24.661 "r_mbytes_per_sec": 0, 00:23:24.661 "w_mbytes_per_sec": 0 00:23:24.661 }, 00:23:24.661 "claimed": true, 00:23:24.661 "claim_type": "exclusive_write", 00:23:24.661 "zoned": false, 00:23:24.661 "supported_io_types": { 
00:23:24.661 "read": true, 00:23:24.661 "write": true, 00:23:24.661 "unmap": true, 00:23:24.661 "flush": true, 00:23:24.661 "reset": true, 00:23:24.661 "nvme_admin": false, 00:23:24.661 "nvme_io": false, 00:23:24.661 "nvme_io_md": false, 00:23:24.661 "write_zeroes": true, 00:23:24.661 "zcopy": true, 00:23:24.661 "get_zone_info": false, 00:23:24.661 "zone_management": false, 00:23:24.661 "zone_append": false, 00:23:24.661 "compare": false, 00:23:24.661 "compare_and_write": false, 00:23:24.661 "abort": true, 00:23:24.661 "seek_hole": false, 00:23:24.661 "seek_data": false, 00:23:24.661 "copy": true, 00:23:24.661 "nvme_iov_md": false 00:23:24.661 }, 00:23:24.661 "memory_domains": [ 00:23:24.661 { 00:23:24.661 "dma_device_id": "system", 00:23:24.661 "dma_device_type": 1 00:23:24.661 }, 00:23:24.661 { 00:23:24.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.661 "dma_device_type": 2 00:23:24.661 } 00:23:24.661 ], 00:23:24.661 "driver_specific": {} 00:23:24.661 }' 00:23:24.661 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.661 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.919 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.178 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:25.178 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:25.178 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:25.178 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:25.436 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:25.436 "name": "BaseBdev2", 00:23:25.436 "aliases": [ 00:23:25.436 "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b" 00:23:25.436 ], 00:23:25.436 "product_name": "Malloc disk", 00:23:25.436 "block_size": 512, 00:23:25.436 "num_blocks": 65536, 00:23:25.436 "uuid": "33a9bf9c-c701-4bde-9f8b-4f0b45282e8b", 00:23:25.436 "assigned_rate_limits": { 00:23:25.436 "rw_ios_per_sec": 0, 00:23:25.436 "rw_mbytes_per_sec": 0, 00:23:25.436 "r_mbytes_per_sec": 0, 00:23:25.436 "w_mbytes_per_sec": 0 00:23:25.436 }, 00:23:25.436 "claimed": true, 00:23:25.436 "claim_type": "exclusive_write", 00:23:25.436 "zoned": false, 00:23:25.436 "supported_io_types": { 00:23:25.436 "read": true, 00:23:25.436 "write": true, 00:23:25.436 "unmap": true, 00:23:25.436 "flush": true, 00:23:25.436 "reset": true, 00:23:25.436 "nvme_admin": false, 00:23:25.436 "nvme_io": false, 00:23:25.436 "nvme_io_md": false, 00:23:25.436 "write_zeroes": true, 00:23:25.436 "zcopy": true, 00:23:25.437 "get_zone_info": false, 00:23:25.437 "zone_management": false, 00:23:25.437 "zone_append": false, 00:23:25.437 "compare": false, 00:23:25.437 "compare_and_write": false, 00:23:25.437 "abort": true, 00:23:25.437 "seek_hole": false, 00:23:25.437 "seek_data": 
false, 00:23:25.437 "copy": true, 00:23:25.437 "nvme_iov_md": false 00:23:25.437 }, 00:23:25.437 "memory_domains": [ 00:23:25.437 { 00:23:25.437 "dma_device_id": "system", 00:23:25.437 "dma_device_type": 1 00:23:25.437 }, 00:23:25.437 { 00:23:25.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.437 "dma_device_type": 2 00:23:25.437 } 00:23:25.437 ], 00:23:25.437 "driver_specific": {} 00:23:25.437 }' 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.437 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:23:25.696 17:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:25.955 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:25.955 "name": "BaseBdev3", 00:23:25.955 "aliases": [ 00:23:25.955 "6bfae68f-199c-4fb2-8e2e-444853a927fd" 00:23:25.955 ], 00:23:25.955 "product_name": "Malloc disk", 00:23:25.955 "block_size": 512, 00:23:25.955 "num_blocks": 65536, 00:23:25.955 "uuid": "6bfae68f-199c-4fb2-8e2e-444853a927fd", 00:23:25.955 "assigned_rate_limits": { 00:23:25.955 "rw_ios_per_sec": 0, 00:23:25.955 "rw_mbytes_per_sec": 0, 00:23:25.955 "r_mbytes_per_sec": 0, 00:23:25.955 "w_mbytes_per_sec": 0 00:23:25.955 }, 00:23:25.955 "claimed": true, 00:23:25.955 "claim_type": "exclusive_write", 00:23:25.955 "zoned": false, 00:23:25.955 "supported_io_types": { 00:23:25.955 "read": true, 00:23:25.955 "write": true, 00:23:25.955 "unmap": true, 00:23:25.955 "flush": true, 00:23:25.955 "reset": true, 00:23:25.955 "nvme_admin": false, 00:23:25.955 "nvme_io": false, 00:23:25.955 "nvme_io_md": false, 00:23:25.955 "write_zeroes": true, 00:23:25.955 "zcopy": true, 00:23:25.955 "get_zone_info": false, 00:23:25.955 "zone_management": false, 00:23:25.955 "zone_append": false, 00:23:25.955 "compare": false, 00:23:25.955 "compare_and_write": false, 00:23:25.955 "abort": true, 00:23:25.955 "seek_hole": false, 00:23:25.955 "seek_data": false, 00:23:25.955 "copy": true, 00:23:25.955 "nvme_iov_md": false 00:23:25.955 }, 00:23:25.955 "memory_domains": [ 00:23:25.955 { 00:23:25.955 "dma_device_id": "system", 00:23:25.955 "dma_device_type": 1 00:23:25.955 }, 00:23:25.955 { 00:23:25.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.955 "dma_device_type": 2 00:23:25.955 } 00:23:25.955 ], 00:23:25.955 "driver_specific": {} 00:23:25.955 }' 00:23:25.955 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.955 17:17:21 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:25.955 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:25.955 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:25.955 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:26.214 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:26.473 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:26.473 "name": "BaseBdev4", 00:23:26.473 "aliases": [ 00:23:26.473 "9fe0a1f9-4e07-4705-8932-e75242501730" 00:23:26.473 ], 00:23:26.473 "product_name": "Malloc disk", 00:23:26.473 "block_size": 512, 00:23:26.473 "num_blocks": 65536, 00:23:26.473 "uuid": "9fe0a1f9-4e07-4705-8932-e75242501730", 00:23:26.473 "assigned_rate_limits": { 00:23:26.473 
"rw_ios_per_sec": 0, 00:23:26.473 "rw_mbytes_per_sec": 0, 00:23:26.473 "r_mbytes_per_sec": 0, 00:23:26.473 "w_mbytes_per_sec": 0 00:23:26.473 }, 00:23:26.473 "claimed": true, 00:23:26.473 "claim_type": "exclusive_write", 00:23:26.473 "zoned": false, 00:23:26.473 "supported_io_types": { 00:23:26.473 "read": true, 00:23:26.473 "write": true, 00:23:26.473 "unmap": true, 00:23:26.473 "flush": true, 00:23:26.473 "reset": true, 00:23:26.473 "nvme_admin": false, 00:23:26.473 "nvme_io": false, 00:23:26.473 "nvme_io_md": false, 00:23:26.473 "write_zeroes": true, 00:23:26.473 "zcopy": true, 00:23:26.473 "get_zone_info": false, 00:23:26.473 "zone_management": false, 00:23:26.473 "zone_append": false, 00:23:26.473 "compare": false, 00:23:26.473 "compare_and_write": false, 00:23:26.473 "abort": true, 00:23:26.473 "seek_hole": false, 00:23:26.473 "seek_data": false, 00:23:26.473 "copy": true, 00:23:26.473 "nvme_iov_md": false 00:23:26.473 }, 00:23:26.473 "memory_domains": [ 00:23:26.473 { 00:23:26.473 "dma_device_id": "system", 00:23:26.473 "dma_device_type": 1 00:23:26.473 }, 00:23:26.473 { 00:23:26.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:26.473 "dma_device_type": 2 00:23:26.473 } 00:23:26.473 ], 00:23:26.473 "driver_specific": {} 00:23:26.473 }' 00:23:26.473 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.473 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:26.732 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:26.732 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.732 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:26.732 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:26.732 17:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:23:26.732 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:26.732 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:26.732 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.732 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:26.732 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:26.732 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:26.990 [2024-07-23 17:17:22.379597] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:26.990 [2024-07-23 17:17:22.379626] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:26.990 [2024-07-23 17:17:22.379680] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:26.990 [2024-07-23 17:17:22.379740] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:26.990 [2024-07-23 17:17:22.379758] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9608f0 name Existed_Raid, state offline 00:23:26.990 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4176876 00:23:26.990 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4176876 ']' 00:23:26.990 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 4176876 00:23:26.990 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:26.990 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:23:26.990 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4176876 00:23:27.248 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:27.248 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:27.248 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4176876' 00:23:27.248 killing process with pid 4176876 00:23:27.248 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4176876 00:23:27.248 [2024-07-23 17:17:22.449594] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:27.248 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4176876 00:23:27.248 [2024-07-23 17:17:22.490716] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:27.522 17:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:23:27.522 00:23:27.522 real 0m36.571s 00:23:27.522 user 1m7.156s 00:23:27.522 sys 0m6.409s 00:23:27.522 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:27.522 17:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:27.522 ************************************ 00:23:27.522 END TEST raid_state_function_test_sb 00:23:27.522 ************************************ 00:23:27.522 17:17:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:27.522 17:17:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:23:27.522 17:17:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:23:27.522 17:17:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:27.522 17:17:22 bdev_raid -- common/autotest_common.sh@10 -- # set 
+x 00:23:27.522 ************************************ 00:23:27.522 START TEST raid_superblock_test 00:23:27.522 ************************************ 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:27.522 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:23:27.523 17:17:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4182273 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4182273 /var/tmp/spdk-raid.sock 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4182273 ']' 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:27.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:27.523 17:17:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.523 [2024-07-23 17:17:22.828526] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:23:27.523 [2024-07-23 17:17:22.828603] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4182273 ] 00:23:27.813 [2024-07-23 17:17:22.975237] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:27.813 [2024-07-23 17:17:23.026587] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:27.813 [2024-07-23 17:17:23.085815] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:27.813 [2024-07-23 17:17:23.085852] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:28.381 17:17:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:23:28.640 malloc1 00:23:28.640 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:28.899 [2024-07-23 17:17:24.231775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:28.899 [2024-07-23 17:17:24.231820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:28.899 [2024-07-23 17:17:24.231841] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb72070 00:23:28.899 [2024-07-23 17:17:24.231854] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:28.899 [2024-07-23 17:17:24.233472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:28.899 [2024-07-23 17:17:24.233500] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:28.899 pt1 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:28.899 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:28.899 17:17:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:23:29.158 malloc2 00:23:29.158 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:29.417 [2024-07-23 17:17:24.725791] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:29.417 [2024-07-23 17:17:24.725834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.417 [2024-07-23 17:17:24.725852] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa58920 00:23:29.417 [2024-07-23 17:17:24.725864] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.417 [2024-07-23 17:17:24.727547] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.417 [2024-07-23 17:17:24.727575] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:29.417 pt2 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:29.417 17:17:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:29.417 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:23:29.675 malloc3 00:23:29.675 17:17:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:29.934 [2024-07-23 17:17:25.223694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:29.935 [2024-07-23 17:17:25.223737] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.935 [2024-07-23 17:17:25.223754] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6a3e0 00:23:29.935 [2024-07-23 17:17:25.223766] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.935 [2024-07-23 17:17:25.225306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.935 [2024-07-23 17:17:25.225334] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:29.935 pt3 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:29.935 
17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:29.935 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:23:30.193 malloc4 00:23:30.193 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:30.452 [2024-07-23 17:17:25.718834] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:30.452 [2024-07-23 17:17:25.718880] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.452 [2024-07-23 17:17:25.718901] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6c870 00:23:30.452 [2024-07-23 17:17:25.718914] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.452 [2024-07-23 17:17:25.720454] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.452 [2024-07-23 17:17:25.720483] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:30.452 pt4 00:23:30.452 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:30.452 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:30.452 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:23:30.710 [2024-07-23 17:17:25.963662] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:23:30.710 [2024-07-23 17:17:25.964995] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:30.710 [2024-07-23 17:17:25.965048] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:30.710 [2024-07-23 17:17:25.965091] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:30.711 [2024-07-23 17:17:25.965261] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xb6de80 00:23:30.711 [2024-07-23 17:17:25.965273] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:30.711 [2024-07-23 17:17:25.965474] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb6a670 00:23:30.711 [2024-07-23 17:17:25.965618] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb6de80 00:23:30.711 [2024-07-23 17:17:25.965628] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb6de80 00:23:30.711 [2024-07-23 17:17:25.965726] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.711 17:17:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.969 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.970 "name": "raid_bdev1", 00:23:30.970 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:30.970 "strip_size_kb": 64, 00:23:30.970 "state": "online", 00:23:30.970 "raid_level": "raid0", 00:23:30.970 "superblock": true, 00:23:30.970 "num_base_bdevs": 4, 00:23:30.970 "num_base_bdevs_discovered": 4, 00:23:30.970 "num_base_bdevs_operational": 4, 00:23:30.970 "base_bdevs_list": [ 00:23:30.970 { 00:23:30.970 "name": "pt1", 00:23:30.970 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:30.970 "is_configured": true, 00:23:30.970 "data_offset": 2048, 00:23:30.970 "data_size": 63488 00:23:30.970 }, 00:23:30.970 { 00:23:30.970 "name": "pt2", 00:23:30.970 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:30.970 "is_configured": true, 00:23:30.970 "data_offset": 2048, 00:23:30.970 "data_size": 63488 00:23:30.970 }, 00:23:30.970 { 00:23:30.970 "name": "pt3", 00:23:30.970 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:30.970 "is_configured": true, 00:23:30.970 "data_offset": 2048, 00:23:30.970 "data_size": 63488 00:23:30.970 }, 00:23:30.970 { 00:23:30.970 "name": "pt4", 00:23:30.970 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:30.970 "is_configured": true, 00:23:30.970 "data_offset": 2048, 00:23:30.970 "data_size": 63488 00:23:30.970 } 00:23:30.970 ] 00:23:30.970 }' 00:23:30.970 17:17:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.970 17:17:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:31.537 17:17:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:31.795 [2024-07-23 17:17:27.054828] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:31.795 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:31.795 "name": "raid_bdev1", 00:23:31.795 "aliases": [ 00:23:31.795 "31f7928f-3079-43d1-a905-75104b4db675" 00:23:31.795 ], 00:23:31.795 "product_name": "Raid Volume", 00:23:31.795 "block_size": 512, 00:23:31.795 "num_blocks": 253952, 00:23:31.795 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:31.795 "assigned_rate_limits": { 00:23:31.795 "rw_ios_per_sec": 0, 00:23:31.795 "rw_mbytes_per_sec": 0, 00:23:31.795 "r_mbytes_per_sec": 0, 00:23:31.795 "w_mbytes_per_sec": 0 00:23:31.795 }, 00:23:31.795 "claimed": false, 00:23:31.795 "zoned": false, 00:23:31.795 "supported_io_types": { 00:23:31.796 "read": true, 00:23:31.796 "write": true, 00:23:31.796 
"unmap": true, 00:23:31.796 "flush": true, 00:23:31.796 "reset": true, 00:23:31.796 "nvme_admin": false, 00:23:31.796 "nvme_io": false, 00:23:31.796 "nvme_io_md": false, 00:23:31.796 "write_zeroes": true, 00:23:31.796 "zcopy": false, 00:23:31.796 "get_zone_info": false, 00:23:31.796 "zone_management": false, 00:23:31.796 "zone_append": false, 00:23:31.796 "compare": false, 00:23:31.796 "compare_and_write": false, 00:23:31.796 "abort": false, 00:23:31.796 "seek_hole": false, 00:23:31.796 "seek_data": false, 00:23:31.796 "copy": false, 00:23:31.796 "nvme_iov_md": false 00:23:31.796 }, 00:23:31.796 "memory_domains": [ 00:23:31.796 { 00:23:31.796 "dma_device_id": "system", 00:23:31.796 "dma_device_type": 1 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.796 "dma_device_type": 2 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "system", 00:23:31.796 "dma_device_type": 1 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.796 "dma_device_type": 2 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "system", 00:23:31.796 "dma_device_type": 1 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.796 "dma_device_type": 2 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "system", 00:23:31.796 "dma_device_type": 1 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.796 "dma_device_type": 2 00:23:31.796 } 00:23:31.796 ], 00:23:31.796 "driver_specific": { 00:23:31.796 "raid": { 00:23:31.796 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:31.796 "strip_size_kb": 64, 00:23:31.796 "state": "online", 00:23:31.796 "raid_level": "raid0", 00:23:31.796 "superblock": true, 00:23:31.796 "num_base_bdevs": 4, 00:23:31.796 "num_base_bdevs_discovered": 4, 00:23:31.796 "num_base_bdevs_operational": 4, 00:23:31.796 "base_bdevs_list": [ 00:23:31.796 { 00:23:31.796 "name": "pt1", 
00:23:31.796 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:31.796 "is_configured": true, 00:23:31.796 "data_offset": 2048, 00:23:31.796 "data_size": 63488 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "name": "pt2", 00:23:31.796 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:31.796 "is_configured": true, 00:23:31.796 "data_offset": 2048, 00:23:31.796 "data_size": 63488 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "name": "pt3", 00:23:31.796 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:31.796 "is_configured": true, 00:23:31.796 "data_offset": 2048, 00:23:31.796 "data_size": 63488 00:23:31.796 }, 00:23:31.796 { 00:23:31.796 "name": "pt4", 00:23:31.796 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:31.796 "is_configured": true, 00:23:31.796 "data_offset": 2048, 00:23:31.796 "data_size": 63488 00:23:31.796 } 00:23:31.796 ] 00:23:31.796 } 00:23:31.796 } 00:23:31.796 }' 00:23:31.796 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:31.796 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:31.796 pt2 00:23:31.796 pt3 00:23:31.796 pt4' 00:23:31.796 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:31.796 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:31.796 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:32.054 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:32.054 "name": "pt1", 00:23:32.054 "aliases": [ 00:23:32.054 "00000000-0000-0000-0000-000000000001" 00:23:32.054 ], 00:23:32.054 "product_name": "passthru", 00:23:32.054 "block_size": 512, 00:23:32.054 "num_blocks": 65536, 00:23:32.054 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:23:32.054 "assigned_rate_limits": { 00:23:32.054 "rw_ios_per_sec": 0, 00:23:32.054 "rw_mbytes_per_sec": 0, 00:23:32.054 "r_mbytes_per_sec": 0, 00:23:32.054 "w_mbytes_per_sec": 0 00:23:32.054 }, 00:23:32.054 "claimed": true, 00:23:32.054 "claim_type": "exclusive_write", 00:23:32.054 "zoned": false, 00:23:32.054 "supported_io_types": { 00:23:32.054 "read": true, 00:23:32.054 "write": true, 00:23:32.054 "unmap": true, 00:23:32.054 "flush": true, 00:23:32.054 "reset": true, 00:23:32.054 "nvme_admin": false, 00:23:32.054 "nvme_io": false, 00:23:32.054 "nvme_io_md": false, 00:23:32.054 "write_zeroes": true, 00:23:32.054 "zcopy": true, 00:23:32.054 "get_zone_info": false, 00:23:32.054 "zone_management": false, 00:23:32.054 "zone_append": false, 00:23:32.054 "compare": false, 00:23:32.054 "compare_and_write": false, 00:23:32.054 "abort": true, 00:23:32.054 "seek_hole": false, 00:23:32.054 "seek_data": false, 00:23:32.054 "copy": true, 00:23:32.054 "nvme_iov_md": false 00:23:32.054 }, 00:23:32.054 "memory_domains": [ 00:23:32.054 { 00:23:32.054 "dma_device_id": "system", 00:23:32.054 "dma_device_type": 1 00:23:32.054 }, 00:23:32.054 { 00:23:32.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:32.054 "dma_device_type": 2 00:23:32.054 } 00:23:32.054 ], 00:23:32.054 "driver_specific": { 00:23:32.054 "passthru": { 00:23:32.054 "name": "pt1", 00:23:32.054 "base_bdev_name": "malloc1" 00:23:32.054 } 00:23:32.054 } 00:23:32.054 }' 00:23:32.054 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.054 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.054 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:32.054 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.313 17:17:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:32.313 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:32.572 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:32.572 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:32.572 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:32.572 "name": "pt2", 00:23:32.572 "aliases": [ 00:23:32.572 "00000000-0000-0000-0000-000000000002" 00:23:32.572 ], 00:23:32.572 "product_name": "passthru", 00:23:32.572 "block_size": 512, 00:23:32.572 "num_blocks": 65536, 00:23:32.572 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:32.572 "assigned_rate_limits": { 00:23:32.572 "rw_ios_per_sec": 0, 00:23:32.572 "rw_mbytes_per_sec": 0, 00:23:32.572 "r_mbytes_per_sec": 0, 00:23:32.572 "w_mbytes_per_sec": 0 00:23:32.572 }, 00:23:32.572 "claimed": true, 00:23:32.572 "claim_type": "exclusive_write", 00:23:32.572 "zoned": false, 00:23:32.572 "supported_io_types": { 00:23:32.572 "read": true, 00:23:32.572 "write": true, 00:23:32.572 "unmap": true, 00:23:32.572 "flush": true, 00:23:32.572 "reset": true, 00:23:32.572 "nvme_admin": false, 00:23:32.572 
"nvme_io": false, 00:23:32.572 "nvme_io_md": false, 00:23:32.572 "write_zeroes": true, 00:23:32.572 "zcopy": true, 00:23:32.572 "get_zone_info": false, 00:23:32.572 "zone_management": false, 00:23:32.572 "zone_append": false, 00:23:32.572 "compare": false, 00:23:32.572 "compare_and_write": false, 00:23:32.572 "abort": true, 00:23:32.572 "seek_hole": false, 00:23:32.572 "seek_data": false, 00:23:32.572 "copy": true, 00:23:32.572 "nvme_iov_md": false 00:23:32.572 }, 00:23:32.572 "memory_domains": [ 00:23:32.572 { 00:23:32.572 "dma_device_id": "system", 00:23:32.572 "dma_device_type": 1 00:23:32.572 }, 00:23:32.572 { 00:23:32.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:32.572 "dma_device_type": 2 00:23:32.572 } 00:23:32.572 ], 00:23:32.572 "driver_specific": { 00:23:32.572 "passthru": { 00:23:32.572 "name": "pt2", 00:23:32.572 "base_bdev_name": "malloc2" 00:23:32.572 } 00:23:32.572 } 00:23:32.572 }' 00:23:32.572 17:17:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:32.830 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:33.089 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:33.089 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:33.089 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:33.089 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:33.089 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:33.348 "name": "pt3", 00:23:33.348 "aliases": [ 00:23:33.348 "00000000-0000-0000-0000-000000000003" 00:23:33.348 ], 00:23:33.348 "product_name": "passthru", 00:23:33.348 "block_size": 512, 00:23:33.348 "num_blocks": 65536, 00:23:33.348 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:33.348 "assigned_rate_limits": { 00:23:33.348 "rw_ios_per_sec": 0, 00:23:33.348 "rw_mbytes_per_sec": 0, 00:23:33.348 "r_mbytes_per_sec": 0, 00:23:33.348 "w_mbytes_per_sec": 0 00:23:33.348 }, 00:23:33.348 "claimed": true, 00:23:33.348 "claim_type": "exclusive_write", 00:23:33.348 "zoned": false, 00:23:33.348 "supported_io_types": { 00:23:33.348 "read": true, 00:23:33.348 "write": true, 00:23:33.348 "unmap": true, 00:23:33.348 "flush": true, 00:23:33.348 "reset": true, 00:23:33.348 "nvme_admin": false, 00:23:33.348 "nvme_io": false, 00:23:33.348 "nvme_io_md": false, 00:23:33.348 "write_zeroes": true, 00:23:33.348 "zcopy": true, 00:23:33.348 "get_zone_info": false, 00:23:33.348 "zone_management": false, 00:23:33.348 "zone_append": false, 00:23:33.348 "compare": false, 00:23:33.348 "compare_and_write": false, 00:23:33.348 "abort": true, 00:23:33.348 "seek_hole": false, 00:23:33.348 "seek_data": false, 00:23:33.348 "copy": true, 00:23:33.348 "nvme_iov_md": false 00:23:33.348 }, 00:23:33.348 "memory_domains": [ 00:23:33.348 { 00:23:33.348 "dma_device_id": "system", 00:23:33.348 
"dma_device_type": 1 00:23:33.348 }, 00:23:33.348 { 00:23:33.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:33.348 "dma_device_type": 2 00:23:33.348 } 00:23:33.348 ], 00:23:33.348 "driver_specific": { 00:23:33.348 "passthru": { 00:23:33.348 "name": "pt3", 00:23:33.348 "base_bdev_name": "malloc3" 00:23:33.348 } 00:23:33.348 } 00:23:33.348 }' 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:33.348 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:33.607 17:17:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:33.867 17:17:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:33.867 "name": "pt4", 00:23:33.867 "aliases": [ 00:23:33.867 "00000000-0000-0000-0000-000000000004" 00:23:33.867 ], 00:23:33.867 "product_name": "passthru", 00:23:33.867 "block_size": 512, 00:23:33.867 "num_blocks": 65536, 00:23:33.867 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:33.867 "assigned_rate_limits": { 00:23:33.867 "rw_ios_per_sec": 0, 00:23:33.867 "rw_mbytes_per_sec": 0, 00:23:33.867 "r_mbytes_per_sec": 0, 00:23:33.867 "w_mbytes_per_sec": 0 00:23:33.867 }, 00:23:33.867 "claimed": true, 00:23:33.867 "claim_type": "exclusive_write", 00:23:33.867 "zoned": false, 00:23:33.867 "supported_io_types": { 00:23:33.867 "read": true, 00:23:33.867 "write": true, 00:23:33.867 "unmap": true, 00:23:33.867 "flush": true, 00:23:33.867 "reset": true, 00:23:33.867 "nvme_admin": false, 00:23:33.867 "nvme_io": false, 00:23:33.867 "nvme_io_md": false, 00:23:33.867 "write_zeroes": true, 00:23:33.867 "zcopy": true, 00:23:33.867 "get_zone_info": false, 00:23:33.867 "zone_management": false, 00:23:33.867 "zone_append": false, 00:23:33.867 "compare": false, 00:23:33.867 "compare_and_write": false, 00:23:33.867 "abort": true, 00:23:33.867 "seek_hole": false, 00:23:33.867 "seek_data": false, 00:23:33.867 "copy": true, 00:23:33.867 "nvme_iov_md": false 00:23:33.867 }, 00:23:33.867 "memory_domains": [ 00:23:33.867 { 00:23:33.867 "dma_device_id": "system", 00:23:33.867 "dma_device_type": 1 00:23:33.867 }, 00:23:33.867 { 00:23:33.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:33.867 "dma_device_type": 2 00:23:33.867 } 00:23:33.867 ], 00:23:33.867 "driver_specific": { 00:23:33.867 "passthru": { 00:23:33.867 "name": "pt4", 00:23:33.867 "base_bdev_name": "malloc4" 00:23:33.867 } 00:23:33.867 } 00:23:33.867 }' 00:23:33.867 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:33.867 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:33.867 17:17:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:33.867 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:34.126 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:34.385 [2024-07-23 17:17:29.761996] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:34.385 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=31f7928f-3079-43d1-a905-75104b4db675 00:23:34.385 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 31f7928f-3079-43d1-a905-75104b4db675 ']' 00:23:34.385 17:17:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:34.643 [2024-07-23 17:17:30.014355] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:34.643 
[2024-07-23 17:17:30.014376] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:34.643 [2024-07-23 17:17:30.014424] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:34.643 [2024-07-23 17:17:30.014488] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:34.643 [2024-07-23 17:17:30.014499] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb6de80 name raid_bdev1, state offline 00:23:34.643 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.643 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:34.902 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:34.902 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:34.902 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:34.902 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:35.161 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:35.161 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:35.419 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:35.420 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:35.679 17:17:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:35.679 17:17:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:35.938 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:35.938 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:36.197 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:36.456 [2024-07-23 17:17:31.642579] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:36.456 [2024-07-23 17:17:31.643916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:36.456 [2024-07-23 17:17:31.643959] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:23:36.456 [2024-07-23 17:17:31.643992] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:23:36.456 [2024-07-23 17:17:31.644035] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:36.456 [2024-07-23 17:17:31.644072] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:36.456 [2024-07-23 17:17:31.644094] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:23:36.456 [2024-07-23 17:17:31.644117] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:23:36.456 
[2024-07-23 17:17:31.644134] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:36.456 [2024-07-23 17:17:31.644144] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa58090 name raid_bdev1, state configuring 00:23:36.456 request: 00:23:36.456 { 00:23:36.456 "name": "raid_bdev1", 00:23:36.456 "raid_level": "raid0", 00:23:36.456 "base_bdevs": [ 00:23:36.456 "malloc1", 00:23:36.456 "malloc2", 00:23:36.456 "malloc3", 00:23:36.456 "malloc4" 00:23:36.456 ], 00:23:36.456 "strip_size_kb": 64, 00:23:36.456 "superblock": false, 00:23:36.456 "method": "bdev_raid_create", 00:23:36.456 "req_id": 1 00:23:36.456 } 00:23:36.456 Got JSON-RPC error response 00:23:36.456 response: 00:23:36.456 { 00:23:36.457 "code": -17, 00:23:36.457 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:36.457 } 00:23:36.457 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:23:36.457 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:36.457 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:36.457 17:17:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:36.457 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.457 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:36.715 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:36.715 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:36.715 17:17:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:23:36.715 [2024-07-23 17:17:32.135951] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:36.715 [2024-07-23 17:17:32.135996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:36.715 [2024-07-23 17:17:32.136013] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb723b0 00:23:36.715 [2024-07-23 17:17:32.136032] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:36.974 [2024-07-23 17:17:32.137587] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:36.974 [2024-07-23 17:17:32.137615] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:36.974 [2024-07-23 17:17:32.137682] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:36.974 [2024-07-23 17:17:32.137708] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:36.974 pt1 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.974 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.233 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:37.233 "name": "raid_bdev1", 00:23:37.233 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:37.233 "strip_size_kb": 64, 00:23:37.233 "state": "configuring", 00:23:37.233 "raid_level": "raid0", 00:23:37.233 "superblock": true, 00:23:37.233 "num_base_bdevs": 4, 00:23:37.233 "num_base_bdevs_discovered": 1, 00:23:37.233 "num_base_bdevs_operational": 4, 00:23:37.233 "base_bdevs_list": [ 00:23:37.233 { 00:23:37.233 "name": "pt1", 00:23:37.233 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:37.233 "is_configured": true, 00:23:37.233 "data_offset": 2048, 00:23:37.233 "data_size": 63488 00:23:37.233 }, 00:23:37.233 { 00:23:37.233 "name": null, 00:23:37.233 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:37.233 "is_configured": false, 00:23:37.233 "data_offset": 2048, 00:23:37.233 "data_size": 63488 00:23:37.233 }, 00:23:37.233 { 00:23:37.233 "name": null, 00:23:37.233 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:37.233 "is_configured": false, 00:23:37.233 "data_offset": 2048, 00:23:37.233 "data_size": 63488 00:23:37.233 }, 00:23:37.233 { 00:23:37.233 "name": null, 00:23:37.233 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:37.233 "is_configured": false, 00:23:37.233 "data_offset": 2048, 00:23:37.233 "data_size": 63488 00:23:37.233 } 00:23:37.233 ] 00:23:37.233 }' 00:23:37.233 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:37.233 17:17:32 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:37.800 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:23:37.800 17:17:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:37.800 [2024-07-23 17:17:33.210807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:37.800 [2024-07-23 17:17:33.210848] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:37.800 [2024-07-23 17:17:33.210866] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c1470 00:23:37.801 [2024-07-23 17:17:33.210878] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:37.801 [2024-07-23 17:17:33.211193] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:37.801 [2024-07-23 17:17:33.211211] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:37.801 [2024-07-23 17:17:33.211268] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:37.801 [2024-07-23 17:17:33.211291] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:37.801 pt2 00:23:38.058 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:38.316 [2024-07-23 17:17:33.495573] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:38.316 17:17:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.316 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.575 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.575 "name": "raid_bdev1", 00:23:38.575 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:38.575 "strip_size_kb": 64, 00:23:38.575 "state": "configuring", 00:23:38.575 "raid_level": "raid0", 00:23:38.575 "superblock": true, 00:23:38.575 "num_base_bdevs": 4, 00:23:38.575 "num_base_bdevs_discovered": 1, 00:23:38.575 "num_base_bdevs_operational": 4, 00:23:38.575 "base_bdevs_list": [ 00:23:38.575 { 00:23:38.575 "name": "pt1", 00:23:38.575 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:38.575 "is_configured": true, 00:23:38.575 "data_offset": 2048, 00:23:38.575 "data_size": 63488 00:23:38.575 }, 00:23:38.575 { 00:23:38.575 "name": null, 00:23:38.575 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:38.575 
"is_configured": false, 00:23:38.575 "data_offset": 2048, 00:23:38.575 "data_size": 63488 00:23:38.575 }, 00:23:38.575 { 00:23:38.575 "name": null, 00:23:38.575 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:38.575 "is_configured": false, 00:23:38.575 "data_offset": 2048, 00:23:38.575 "data_size": 63488 00:23:38.575 }, 00:23:38.575 { 00:23:38.575 "name": null, 00:23:38.575 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:38.575 "is_configured": false, 00:23:38.575 "data_offset": 2048, 00:23:38.575 "data_size": 63488 00:23:38.575 } 00:23:38.575 ] 00:23:38.575 }' 00:23:38.575 17:17:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.575 17:17:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:39.141 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:39.141 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:39.141 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:39.400 [2024-07-23 17:17:34.594479] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:39.400 [2024-07-23 17:17:34.594526] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.400 [2024-07-23 17:17:34.594544] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb73850 00:23:39.400 [2024-07-23 17:17:34.594556] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.400 [2024-07-23 17:17:34.594877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.400 [2024-07-23 17:17:34.594902] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:39.400 [2024-07-23 17:17:34.594960] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:39.400 [2024-07-23 17:17:34.594984] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:39.400 pt2 00:23:39.400 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:39.400 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:39.400 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:39.400 [2024-07-23 17:17:34.778972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:39.400 [2024-07-23 17:17:34.779003] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.400 [2024-07-23 17:17:34.779018] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c0dc0 00:23:39.400 [2024-07-23 17:17:34.779030] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.400 [2024-07-23 17:17:34.779308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.400 [2024-07-23 17:17:34.779325] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:39.400 [2024-07-23 17:17:34.779376] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:39.400 [2024-07-23 17:17:34.779393] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:39.400 pt3 00:23:39.400 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:39.400 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:39.400 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:39.659 [2024-07-23 17:17:34.967480] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:39.659 [2024-07-23 17:17:34.967516] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.659 [2024-07-23 17:17:34.967532] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c1890 00:23:39.659 [2024-07-23 17:17:34.967544] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.659 [2024-07-23 17:17:34.967832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.659 [2024-07-23 17:17:34.967848] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:39.659 [2024-07-23 17:17:34.967906] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:39.659 [2024-07-23 17:17:34.967925] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:39.659 [2024-07-23 17:17:34.968037] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xb71850 00:23:39.659 [2024-07-23 17:17:34.968047] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:39.659 [2024-07-23 17:17:34.968207] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb70030 00:23:39.659 [2024-07-23 17:17:34.968328] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb71850 00:23:39.659 [2024-07-23 17:17:34.968337] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb71850 00:23:39.659 [2024-07-23 17:17:34.968426] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.659 pt4 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.659 17:17:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.918 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.918 "name": "raid_bdev1", 00:23:39.918 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:39.918 "strip_size_kb": 64, 00:23:39.918 "state": "online", 00:23:39.918 "raid_level": "raid0", 00:23:39.918 "superblock": true, 00:23:39.918 "num_base_bdevs": 4, 00:23:39.918 "num_base_bdevs_discovered": 4, 00:23:39.918 "num_base_bdevs_operational": 4, 00:23:39.918 "base_bdevs_list": [ 00:23:39.918 { 00:23:39.918 
"name": "pt1", 00:23:39.918 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:39.918 "is_configured": true, 00:23:39.918 "data_offset": 2048, 00:23:39.918 "data_size": 63488 00:23:39.918 }, 00:23:39.918 { 00:23:39.918 "name": "pt2", 00:23:39.918 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:39.918 "is_configured": true, 00:23:39.918 "data_offset": 2048, 00:23:39.918 "data_size": 63488 00:23:39.918 }, 00:23:39.918 { 00:23:39.918 "name": "pt3", 00:23:39.918 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:39.918 "is_configured": true, 00:23:39.918 "data_offset": 2048, 00:23:39.918 "data_size": 63488 00:23:39.918 }, 00:23:39.918 { 00:23:39.918 "name": "pt4", 00:23:39.918 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:39.918 "is_configured": true, 00:23:39.918 "data_offset": 2048, 00:23:39.918 "data_size": 63488 00:23:39.918 } 00:23:39.918 ] 00:23:39.918 }' 00:23:39.918 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.918 17:17:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:40.487 17:17:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:41.054 [2024-07-23 17:17:36.387559] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:41.054 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:41.054 "name": "raid_bdev1", 00:23:41.054 "aliases": [ 00:23:41.054 "31f7928f-3079-43d1-a905-75104b4db675" 00:23:41.054 ], 00:23:41.054 "product_name": "Raid Volume", 00:23:41.054 "block_size": 512, 00:23:41.054 "num_blocks": 253952, 00:23:41.054 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:41.054 "assigned_rate_limits": { 00:23:41.054 "rw_ios_per_sec": 0, 00:23:41.054 "rw_mbytes_per_sec": 0, 00:23:41.054 "r_mbytes_per_sec": 0, 00:23:41.054 "w_mbytes_per_sec": 0 00:23:41.054 }, 00:23:41.054 "claimed": false, 00:23:41.054 "zoned": false, 00:23:41.054 "supported_io_types": { 00:23:41.054 "read": true, 00:23:41.054 "write": true, 00:23:41.054 "unmap": true, 00:23:41.054 "flush": true, 00:23:41.054 "reset": true, 00:23:41.054 "nvme_admin": false, 00:23:41.054 "nvme_io": false, 00:23:41.054 "nvme_io_md": false, 00:23:41.054 "write_zeroes": true, 00:23:41.054 "zcopy": false, 00:23:41.054 "get_zone_info": false, 00:23:41.054 "zone_management": false, 00:23:41.054 "zone_append": false, 00:23:41.054 "compare": false, 00:23:41.054 "compare_and_write": false, 00:23:41.054 "abort": false, 00:23:41.054 "seek_hole": false, 00:23:41.054 "seek_data": false, 00:23:41.054 "copy": false, 00:23:41.054 "nvme_iov_md": false 00:23:41.054 }, 00:23:41.054 "memory_domains": [ 00:23:41.054 { 00:23:41.054 "dma_device_id": "system", 00:23:41.054 "dma_device_type": 1 00:23:41.054 }, 00:23:41.054 { 00:23:41.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.054 "dma_device_type": 2 00:23:41.054 }, 00:23:41.054 { 00:23:41.054 "dma_device_id": "system", 00:23:41.054 "dma_device_type": 1 00:23:41.054 }, 00:23:41.054 { 00:23:41.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.054 "dma_device_type": 2 00:23:41.054 }, 
00:23:41.054 { 00:23:41.054 "dma_device_id": "system", 00:23:41.054 "dma_device_type": 1 00:23:41.054 }, 00:23:41.054 { 00:23:41.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.054 "dma_device_type": 2 00:23:41.054 }, 00:23:41.054 { 00:23:41.054 "dma_device_id": "system", 00:23:41.054 "dma_device_type": 1 00:23:41.054 }, 00:23:41.054 { 00:23:41.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.054 "dma_device_type": 2 00:23:41.054 } 00:23:41.054 ], 00:23:41.054 "driver_specific": { 00:23:41.054 "raid": { 00:23:41.054 "uuid": "31f7928f-3079-43d1-a905-75104b4db675", 00:23:41.054 "strip_size_kb": 64, 00:23:41.054 "state": "online", 00:23:41.054 "raid_level": "raid0", 00:23:41.054 "superblock": true, 00:23:41.055 "num_base_bdevs": 4, 00:23:41.055 "num_base_bdevs_discovered": 4, 00:23:41.055 "num_base_bdevs_operational": 4, 00:23:41.055 "base_bdevs_list": [ 00:23:41.055 { 00:23:41.055 "name": "pt1", 00:23:41.055 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:41.055 "is_configured": true, 00:23:41.055 "data_offset": 2048, 00:23:41.055 "data_size": 63488 00:23:41.055 }, 00:23:41.055 { 00:23:41.055 "name": "pt2", 00:23:41.055 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:41.055 "is_configured": true, 00:23:41.055 "data_offset": 2048, 00:23:41.055 "data_size": 63488 00:23:41.055 }, 00:23:41.055 { 00:23:41.055 "name": "pt3", 00:23:41.055 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:41.055 "is_configured": true, 00:23:41.055 "data_offset": 2048, 00:23:41.055 "data_size": 63488 00:23:41.055 }, 00:23:41.055 { 00:23:41.055 "name": "pt4", 00:23:41.055 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:41.055 "is_configured": true, 00:23:41.055 "data_offset": 2048, 00:23:41.055 "data_size": 63488 00:23:41.055 } 00:23:41.055 ] 00:23:41.055 } 00:23:41.055 } 00:23:41.055 }' 00:23:41.055 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:23:41.055 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:41.055 pt2 00:23:41.055 pt3 00:23:41.055 pt4' 00:23:41.055 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:41.055 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:41.055 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:41.313 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:41.313 "name": "pt1", 00:23:41.313 "aliases": [ 00:23:41.313 "00000000-0000-0000-0000-000000000001" 00:23:41.313 ], 00:23:41.313 "product_name": "passthru", 00:23:41.313 "block_size": 512, 00:23:41.313 "num_blocks": 65536, 00:23:41.313 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:41.313 "assigned_rate_limits": { 00:23:41.313 "rw_ios_per_sec": 0, 00:23:41.313 "rw_mbytes_per_sec": 0, 00:23:41.313 "r_mbytes_per_sec": 0, 00:23:41.313 "w_mbytes_per_sec": 0 00:23:41.313 }, 00:23:41.313 "claimed": true, 00:23:41.313 "claim_type": "exclusive_write", 00:23:41.313 "zoned": false, 00:23:41.313 "supported_io_types": { 00:23:41.313 "read": true, 00:23:41.313 "write": true, 00:23:41.313 "unmap": true, 00:23:41.313 "flush": true, 00:23:41.313 "reset": true, 00:23:41.313 "nvme_admin": false, 00:23:41.313 "nvme_io": false, 00:23:41.313 "nvme_io_md": false, 00:23:41.313 "write_zeroes": true, 00:23:41.313 "zcopy": true, 00:23:41.313 "get_zone_info": false, 00:23:41.313 "zone_management": false, 00:23:41.313 "zone_append": false, 00:23:41.313 "compare": false, 00:23:41.313 "compare_and_write": false, 00:23:41.313 "abort": true, 00:23:41.313 "seek_hole": false, 00:23:41.313 "seek_data": false, 00:23:41.313 "copy": true, 00:23:41.313 "nvme_iov_md": false 00:23:41.314 }, 00:23:41.314 "memory_domains": [ 00:23:41.314 { 
00:23:41.314 "dma_device_id": "system", 00:23:41.314 "dma_device_type": 1 00:23:41.314 }, 00:23:41.314 { 00:23:41.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.314 "dma_device_type": 2 00:23:41.314 } 00:23:41.314 ], 00:23:41.314 "driver_specific": { 00:23:41.314 "passthru": { 00:23:41.314 "name": "pt1", 00:23:41.314 "base_bdev_name": "malloc1" 00:23:41.314 } 00:23:41.314 } 00:23:41.314 }' 00:23:41.314 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:41.572 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:41.572 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:41.572 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:41.572 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:41.572 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:41.572 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:41.831 17:17:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:41.831 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:42.090 
17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:42.090 "name": "pt2", 00:23:42.090 "aliases": [ 00:23:42.090 "00000000-0000-0000-0000-000000000002" 00:23:42.090 ], 00:23:42.090 "product_name": "passthru", 00:23:42.090 "block_size": 512, 00:23:42.090 "num_blocks": 65536, 00:23:42.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:42.090 "assigned_rate_limits": { 00:23:42.090 "rw_ios_per_sec": 0, 00:23:42.090 "rw_mbytes_per_sec": 0, 00:23:42.090 "r_mbytes_per_sec": 0, 00:23:42.090 "w_mbytes_per_sec": 0 00:23:42.090 }, 00:23:42.090 "claimed": true, 00:23:42.090 "claim_type": "exclusive_write", 00:23:42.090 "zoned": false, 00:23:42.090 "supported_io_types": { 00:23:42.090 "read": true, 00:23:42.090 "write": true, 00:23:42.090 "unmap": true, 00:23:42.090 "flush": true, 00:23:42.090 "reset": true, 00:23:42.090 "nvme_admin": false, 00:23:42.090 "nvme_io": false, 00:23:42.090 "nvme_io_md": false, 00:23:42.090 "write_zeroes": true, 00:23:42.090 "zcopy": true, 00:23:42.090 "get_zone_info": false, 00:23:42.090 "zone_management": false, 00:23:42.090 "zone_append": false, 00:23:42.090 "compare": false, 00:23:42.090 "compare_and_write": false, 00:23:42.090 "abort": true, 00:23:42.090 "seek_hole": false, 00:23:42.090 "seek_data": false, 00:23:42.090 "copy": true, 00:23:42.090 "nvme_iov_md": false 00:23:42.090 }, 00:23:42.090 "memory_domains": [ 00:23:42.090 { 00:23:42.090 "dma_device_id": "system", 00:23:42.090 "dma_device_type": 1 00:23:42.090 }, 00:23:42.090 { 00:23:42.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.090 "dma_device_type": 2 00:23:42.090 } 00:23:42.090 ], 00:23:42.090 "driver_specific": { 00:23:42.090 "passthru": { 00:23:42.090 "name": "pt2", 00:23:42.090 "base_bdev_name": "malloc2" 00:23:42.090 } 00:23:42.090 } 00:23:42.090 }' 00:23:42.090 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:42.090 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:23:42.090 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:42.090 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:42.349 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:42.608 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:42.608 17:17:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:42.608 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:42.608 "name": "pt3", 00:23:42.608 "aliases": [ 00:23:42.608 "00000000-0000-0000-0000-000000000003" 00:23:42.608 ], 00:23:42.608 "product_name": "passthru", 00:23:42.608 "block_size": 512, 00:23:42.608 "num_blocks": 65536, 00:23:42.608 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:42.608 "assigned_rate_limits": { 00:23:42.608 "rw_ios_per_sec": 0, 00:23:42.608 "rw_mbytes_per_sec": 0, 00:23:42.608 "r_mbytes_per_sec": 0, 00:23:42.608 "w_mbytes_per_sec": 0 00:23:42.608 }, 
00:23:42.608 "claimed": true, 00:23:42.608 "claim_type": "exclusive_write", 00:23:42.608 "zoned": false, 00:23:42.608 "supported_io_types": { 00:23:42.608 "read": true, 00:23:42.608 "write": true, 00:23:42.608 "unmap": true, 00:23:42.608 "flush": true, 00:23:42.608 "reset": true, 00:23:42.608 "nvme_admin": false, 00:23:42.608 "nvme_io": false, 00:23:42.608 "nvme_io_md": false, 00:23:42.608 "write_zeroes": true, 00:23:42.608 "zcopy": true, 00:23:42.608 "get_zone_info": false, 00:23:42.608 "zone_management": false, 00:23:42.608 "zone_append": false, 00:23:42.608 "compare": false, 00:23:42.608 "compare_and_write": false, 00:23:42.608 "abort": true, 00:23:42.608 "seek_hole": false, 00:23:42.608 "seek_data": false, 00:23:42.608 "copy": true, 00:23:42.608 "nvme_iov_md": false 00:23:42.608 }, 00:23:42.608 "memory_domains": [ 00:23:42.608 { 00:23:42.608 "dma_device_id": "system", 00:23:42.608 "dma_device_type": 1 00:23:42.608 }, 00:23:42.608 { 00:23:42.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.608 "dma_device_type": 2 00:23:42.608 } 00:23:42.608 ], 00:23:42.608 "driver_specific": { 00:23:42.608 "passthru": { 00:23:42.608 "name": "pt3", 00:23:42.608 "base_bdev_name": "malloc3" 00:23:42.608 } 00:23:42.608 } 00:23:42.608 }' 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:42.867 17:17:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:43.126 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:43.385 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:43.385 "name": "pt4", 00:23:43.385 "aliases": [ 00:23:43.385 "00000000-0000-0000-0000-000000000004" 00:23:43.385 ], 00:23:43.385 "product_name": "passthru", 00:23:43.385 "block_size": 512, 00:23:43.385 "num_blocks": 65536, 00:23:43.385 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:43.385 "assigned_rate_limits": { 00:23:43.385 "rw_ios_per_sec": 0, 00:23:43.385 "rw_mbytes_per_sec": 0, 00:23:43.385 "r_mbytes_per_sec": 0, 00:23:43.385 "w_mbytes_per_sec": 0 00:23:43.385 }, 00:23:43.385 "claimed": true, 00:23:43.385 "claim_type": "exclusive_write", 00:23:43.385 "zoned": false, 00:23:43.385 "supported_io_types": { 00:23:43.385 "read": true, 00:23:43.385 "write": true, 00:23:43.385 "unmap": true, 00:23:43.385 "flush": true, 00:23:43.385 "reset": true, 00:23:43.385 "nvme_admin": false, 00:23:43.385 "nvme_io": false, 00:23:43.385 "nvme_io_md": false, 00:23:43.385 "write_zeroes": true, 00:23:43.385 "zcopy": true, 00:23:43.385 "get_zone_info": false, 00:23:43.385 "zone_management": false, 00:23:43.385 "zone_append": false, 00:23:43.385 
"compare": false, 00:23:43.385 "compare_and_write": false, 00:23:43.385 "abort": true, 00:23:43.385 "seek_hole": false, 00:23:43.385 "seek_data": false, 00:23:43.385 "copy": true, 00:23:43.385 "nvme_iov_md": false 00:23:43.385 }, 00:23:43.385 "memory_domains": [ 00:23:43.385 { 00:23:43.385 "dma_device_id": "system", 00:23:43.385 "dma_device_type": 1 00:23:43.385 }, 00:23:43.385 { 00:23:43.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:43.385 "dma_device_type": 2 00:23:43.385 } 00:23:43.385 ], 00:23:43.385 "driver_specific": { 00:23:43.385 "passthru": { 00:23:43.385 "name": "pt4", 00:23:43.385 "base_bdev_name": "malloc4" 00:23:43.385 } 00:23:43.385 } 00:23:43.385 }' 00:23:43.385 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.385 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.385 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:43.385 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.385 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.643 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:43.643 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.643 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.643 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:43.643 17:17:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.643 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:43.935 [2024-07-23 17:17:39.303279] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 31f7928f-3079-43d1-a905-75104b4db675 '!=' 31f7928f-3079-43d1-a905-75104b4db675 ']' 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4182273 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4182273 ']' 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4182273 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:43.935 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4182273 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4182273' 00:23:44.194 killing process with pid 4182273 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 4182273 00:23:44.194 [2024-07-23 
17:17:39.376912] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:44.194 [2024-07-23 17:17:39.376976] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4182273 00:23:44.194 [2024-07-23 17:17:39.377040] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:44.194 [2024-07-23 17:17:39.377053] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb71850 name raid_bdev1, state offline 00:23:44.194 [2024-07-23 17:17:39.413249] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:23:44.194 00:23:44.194 real 0m16.838s 00:23:44.194 user 0m30.492s 00:23:44.194 sys 0m2.983s 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:44.194 17:17:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:44.194 ************************************ 00:23:44.194 END TEST raid_superblock_test 00:23:44.194 ************************************ 00:23:44.453 17:17:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:44.453 17:17:39 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:23:44.453 17:17:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:44.453 17:17:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:44.453 17:17:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:44.453 ************************************ 00:23:44.453 START TEST raid_read_error_test 00:23:44.453 ************************************ 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:23:44.453 17:17:39 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FBeziHxNW8 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=4184710 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4184710 /var/tmp/spdk-raid.sock 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 4184710 ']' 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:23:44.453 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:44.453 17:17:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:44.453 [2024-07-23 17:17:39.761912] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:23:44.453 [2024-07-23 17:17:39.761977] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4184710 ] 00:23:44.712 [2024-07-23 17:17:39.893261] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:44.712 [2024-07-23 17:17:39.945318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:44.712 [2024-07-23 17:17:40.019758] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:44.712 [2024-07-23 17:17:40.019791] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:45.279 17:17:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:45.279 17:17:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:45.279 17:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:45.279 17:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:45.539 BaseBdev1_malloc 00:23:45.539 17:17:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:45.798 true 00:23:45.798 
17:17:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:46.057 [2024-07-23 17:17:41.409354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:46.057 [2024-07-23 17:17:41.409400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.057 [2024-07-23 17:17:41.409420] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c385c0 00:23:46.057 [2024-07-23 17:17:41.409432] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.057 [2024-07-23 17:17:41.411071] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.057 [2024-07-23 17:17:41.411098] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:46.057 BaseBdev1 00:23:46.057 17:17:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:46.057 17:17:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:46.315 BaseBdev2_malloc 00:23:46.316 17:17:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:46.574 true 00:23:46.574 17:17:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:46.833 [2024-07-23 17:17:42.147929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:46.833 [2024-07-23 17:17:42.147974] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.833 [2024-07-23 17:17:42.147995] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c32620 00:23:46.833 [2024-07-23 17:17:42.148008] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.833 [2024-07-23 17:17:42.149584] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.833 [2024-07-23 17:17:42.149610] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:46.833 BaseBdev2 00:23:46.833 17:17:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:46.833 17:17:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:47.092 BaseBdev3_malloc 00:23:47.092 17:17:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:47.351 true 00:23:47.351 17:17:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:47.609 [2024-07-23 17:17:42.875600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:47.609 [2024-07-23 17:17:42.875646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:47.609 [2024-07-23 17:17:42.875667] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c32c00 00:23:47.609 [2024-07-23 17:17:42.875680] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:47.609 [2024-07-23 17:17:42.877214] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:23:47.609 [2024-07-23 17:17:42.877242] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:47.609 BaseBdev3 00:23:47.609 17:17:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:47.609 17:17:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:47.869 BaseBdev4_malloc 00:23:47.869 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:48.128 true 00:23:48.128 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:48.388 [2024-07-23 17:17:43.618170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:48.388 [2024-07-23 17:17:43.618213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:48.388 [2024-07-23 17:17:43.618233] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c359c0 00:23:48.388 [2024-07-23 17:17:43.618245] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:48.388 [2024-07-23 17:17:43.619758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:48.388 [2024-07-23 17:17:43.619785] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:48.388 BaseBdev4 00:23:48.388 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:23:48.647 [2024-07-23 17:17:43.862861] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:48.647 [2024-07-23 17:17:43.864210] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:48.647 [2024-07-23 17:17:43.864273] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:48.647 [2024-07-23 17:17:43.864331] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:48.647 [2024-07-23 17:17:43.864555] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b20510 00:23:48.647 [2024-07-23 17:17:43.864566] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:48.647 [2024-07-23 17:17:43.864765] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a84980 00:23:48.647 [2024-07-23 17:17:43.864922] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b20510 00:23:48.647 [2024-07-23 17:17:43.864933] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b20510 00:23:48.647 [2024-07-23 17:17:43.865035] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.647 17:17:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.906 17:17:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.906 "name": "raid_bdev1", 00:23:48.906 "uuid": "6de6a6d8-1484-4c55-8084-1aee55eebf58", 00:23:48.906 "strip_size_kb": 64, 00:23:48.906 "state": "online", 00:23:48.906 "raid_level": "raid0", 00:23:48.906 "superblock": true, 00:23:48.906 "num_base_bdevs": 4, 00:23:48.906 "num_base_bdevs_discovered": 4, 00:23:48.906 "num_base_bdevs_operational": 4, 00:23:48.906 "base_bdevs_list": [ 00:23:48.906 { 00:23:48.906 "name": "BaseBdev1", 00:23:48.906 "uuid": "5ab3a3af-67eb-53ed-8020-48a4b7ed98cb", 00:23:48.906 "is_configured": true, 00:23:48.906 "data_offset": 2048, 00:23:48.906 "data_size": 63488 00:23:48.906 }, 00:23:48.906 { 00:23:48.906 "name": "BaseBdev2", 00:23:48.906 "uuid": "a12f8269-74ed-55b4-b051-2cc0f0ffd00b", 00:23:48.906 "is_configured": true, 00:23:48.906 "data_offset": 2048, 00:23:48.906 "data_size": 63488 00:23:48.906 }, 00:23:48.906 { 00:23:48.906 "name": "BaseBdev3", 00:23:48.906 "uuid": "d61d69d9-26d9-51ff-979f-0dad7128ca31", 00:23:48.906 "is_configured": true, 00:23:48.906 "data_offset": 2048, 00:23:48.906 "data_size": 63488 00:23:48.906 }, 00:23:48.906 { 00:23:48.906 "name": "BaseBdev4", 00:23:48.906 "uuid": "33523b2f-af71-5deb-a418-9e9989a45a6c", 00:23:48.906 
"is_configured": true, 00:23:48.906 "data_offset": 2048, 00:23:48.906 "data_size": 63488 00:23:48.906 } 00:23:48.906 ] 00:23:48.906 }' 00:23:48.906 17:17:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.906 17:17:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:49.842 17:17:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:49.842 17:17:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:49.842 [2024-07-23 17:17:45.154538] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b3c200 00:23:50.780 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.040 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.300 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:51.300 "name": "raid_bdev1", 00:23:51.300 "uuid": "6de6a6d8-1484-4c55-8084-1aee55eebf58", 00:23:51.300 "strip_size_kb": 64, 00:23:51.300 "state": "online", 00:23:51.300 "raid_level": "raid0", 00:23:51.300 "superblock": true, 00:23:51.300 "num_base_bdevs": 4, 00:23:51.300 "num_base_bdevs_discovered": 4, 00:23:51.300 "num_base_bdevs_operational": 4, 00:23:51.300 "base_bdevs_list": [ 00:23:51.300 { 00:23:51.300 "name": "BaseBdev1", 00:23:51.300 "uuid": "5ab3a3af-67eb-53ed-8020-48a4b7ed98cb", 00:23:51.300 "is_configured": true, 00:23:51.300 "data_offset": 2048, 00:23:51.300 "data_size": 63488 00:23:51.300 }, 00:23:51.300 { 00:23:51.300 "name": "BaseBdev2", 00:23:51.300 "uuid": "a12f8269-74ed-55b4-b051-2cc0f0ffd00b", 00:23:51.300 "is_configured": true, 00:23:51.300 "data_offset": 2048, 00:23:51.300 "data_size": 63488 00:23:51.300 }, 00:23:51.300 { 00:23:51.300 "name": "BaseBdev3", 00:23:51.300 "uuid": "d61d69d9-26d9-51ff-979f-0dad7128ca31", 00:23:51.300 "is_configured": true, 00:23:51.300 "data_offset": 2048, 00:23:51.300 "data_size": 63488 00:23:51.300 }, 00:23:51.300 { 00:23:51.300 "name": "BaseBdev4", 00:23:51.300 "uuid": 
"33523b2f-af71-5deb-a418-9e9989a45a6c", 00:23:51.300 "is_configured": true, 00:23:51.300 "data_offset": 2048, 00:23:51.300 "data_size": 63488 00:23:51.300 } 00:23:51.300 ] 00:23:51.300 }' 00:23:51.300 17:17:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:51.300 17:17:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:52.238 [2024-07-23 17:17:47.586712] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:52.238 [2024-07-23 17:17:47.586752] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:52.238 [2024-07-23 17:17:47.589909] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:52.238 [2024-07-23 17:17:47.589952] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.238 [2024-07-23 17:17:47.589990] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:52.238 [2024-07-23 17:17:47.590001] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b20510 name raid_bdev1, state offline 00:23:52.238 0 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4184710 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 4184710 ']' 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 4184710 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:52.238 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 4184710 00:23:52.497 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:52.497 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:52.497 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4184710' 00:23:52.497 killing process with pid 4184710 00:23:52.497 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 4184710 00:23:52.497 [2024-07-23 17:17:47.684915] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:52.497 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 4184710 00:23:52.497 [2024-07-23 17:17:47.720777] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FBeziHxNW8 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.41 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.41 != \0\.\0\0 ]] 00:23:52.757 00:23:52.757 real 0m8.262s 00:23:52.757 user 0m13.422s 00:23:52.757 sys 0m1.375s 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:52.757 17:17:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:52.757 
************************************ 00:23:52.757 END TEST raid_read_error_test 00:23:52.757 ************************************ 00:23:52.757 17:17:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:52.757 17:17:47 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:23:52.757 17:17:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:52.757 17:17:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:52.757 17:17:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:52.757 ************************************ 00:23:52.757 START TEST raid_write_error_test 00:23:52.757 ************************************ 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.uUF485YKS8 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=4185864 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 4185864 /var/tmp/spdk-raid.sock 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 4185864 ']' 00:23:52.757 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:52.758 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:52.758 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:52.758 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:52.758 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:52.758 17:17:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:52.758 [2024-07-23 17:17:48.118880] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:23:52.758 [2024-07-23 17:17:48.118959] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4185864 ] 00:23:53.017 [2024-07-23 17:17:48.254139] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:53.017 [2024-07-23 17:17:48.307932] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:53.017 [2024-07-23 17:17:48.377121] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:53.017 [2024-07-23 17:17:48.377168] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:53.955 17:17:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:53.955 17:17:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:53.955 17:17:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:53.955 17:17:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:53.955 BaseBdev1_malloc 00:23:53.955 17:17:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:54.213 true 00:23:54.213 17:17:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:54.472 [2024-07-23 17:17:49.768270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:54.472 [2024-07-23 17:17:49.768311] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:23:54.472 [2024-07-23 17:17:49.768330] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c85c0 00:23:54.472 [2024-07-23 17:17:49.768343] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.472 [2024-07-23 17:17:49.769828] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.472 [2024-07-23 17:17:49.769856] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:54.472 BaseBdev1 00:23:54.472 17:17:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:54.472 17:17:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:54.731 BaseBdev2_malloc 00:23:54.731 17:17:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:54.990 true 00:23:54.990 17:17:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:55.249 [2024-07-23 17:17:50.527033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:55.249 [2024-07-23 17:17:50.527083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.249 [2024-07-23 17:17:50.527104] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c2620 00:23:55.249 [2024-07-23 17:17:50.527117] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.249 [2024-07-23 17:17:50.528549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.249 [2024-07-23 17:17:50.528577] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:55.249 BaseBdev2 00:23:55.249 17:17:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:55.249 17:17:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:55.507 BaseBdev3_malloc 00:23:55.507 17:17:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:55.766 true 00:23:55.766 17:17:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:56.025 [2024-07-23 17:17:51.281624] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:56.025 [2024-07-23 17:17:51.281665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.025 [2024-07-23 17:17:51.281686] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c2c00 00:23:56.025 [2024-07-23 17:17:51.281698] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.025 [2024-07-23 17:17:51.283151] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.025 [2024-07-23 17:17:51.283179] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:56.025 BaseBdev3 00:23:56.025 17:17:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:56.025 17:17:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:56.284 BaseBdev4_malloc 00:23:56.284 17:17:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:56.543 true 00:23:56.543 17:17:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:56.802 [2024-07-23 17:17:52.032252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:56.802 [2024-07-23 17:17:52.032297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.802 [2024-07-23 17:17:52.032317] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c59c0 00:23:56.802 [2024-07-23 17:17:52.032330] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.802 [2024-07-23 17:17:52.033740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.802 [2024-07-23 17:17:52.033767] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:56.802 BaseBdev4 00:23:56.802 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:57.061 [2024-07-23 17:17:52.280955] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:57.061 [2024-07-23 17:17:52.282155] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:57.061 [2024-07-23 17:17:52.282217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:57.061 [2024-07-23 17:17:52.282277] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:57.061 [2024-07-23 17:17:52.282502] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x10b0510 00:23:57.061 [2024-07-23 17:17:52.282513] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:23:57.061 [2024-07-23 17:17:52.282700] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1014980 00:23:57.061 [2024-07-23 17:17:52.282845] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10b0510 00:23:57.061 [2024-07-23 17:17:52.282855] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10b0510 00:23:57.061 [2024-07-23 17:17:52.282962] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:57.061 17:17:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.061 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.320 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:57.320 "name": "raid_bdev1", 00:23:57.320 "uuid": "5e7c2958-22dc-4996-b346-cff7fe2c0ece", 00:23:57.320 "strip_size_kb": 64, 00:23:57.320 "state": "online", 00:23:57.320 "raid_level": "raid0", 00:23:57.320 "superblock": true, 00:23:57.320 "num_base_bdevs": 4, 00:23:57.320 "num_base_bdevs_discovered": 4, 00:23:57.320 "num_base_bdevs_operational": 4, 00:23:57.320 "base_bdevs_list": [ 00:23:57.320 { 00:23:57.320 "name": "BaseBdev1", 00:23:57.320 "uuid": "57bae558-89ac-5d81-a134-aebb2bab694c", 00:23:57.320 "is_configured": true, 00:23:57.320 "data_offset": 2048, 00:23:57.320 "data_size": 63488 00:23:57.320 }, 00:23:57.320 { 00:23:57.320 "name": "BaseBdev2", 00:23:57.320 "uuid": "06785233-d46e-5e69-b499-aaecdb566d98", 00:23:57.320 "is_configured": true, 00:23:57.320 "data_offset": 2048, 00:23:57.320 "data_size": 63488 00:23:57.320 }, 00:23:57.320 { 00:23:57.320 "name": "BaseBdev3", 00:23:57.320 "uuid": "3b2c50ea-552e-5213-86f5-af47e62cb9c4", 00:23:57.320 "is_configured": true, 00:23:57.320 "data_offset": 2048, 00:23:57.320 "data_size": 63488 00:23:57.320 }, 00:23:57.320 { 00:23:57.320 "name": "BaseBdev4", 00:23:57.320 "uuid": "0b2bffd5-310c-504d-bec0-d627a75d731f", 00:23:57.320 "is_configured": true, 00:23:57.320 "data_offset": 2048, 00:23:57.320 "data_size": 63488 00:23:57.320 } 00:23:57.320 ] 00:23:57.320 }' 00:23:57.320 17:17:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:57.320 17:17:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:57.888 17:17:53 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1
00:23:57.888 17:17:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:23:58.146 [2024-07-23 17:17:53.428260] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10cc200
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]]
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:23:59.083 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:23:59.651 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:23:59.651 "name": "raid_bdev1",
00:23:59.651 "uuid": "5e7c2958-22dc-4996-b346-cff7fe2c0ece",
00:23:59.651 "strip_size_kb": 64,
00:23:59.651 "state": "online",
00:23:59.651 "raid_level": "raid0",
00:23:59.651 "superblock": true,
00:23:59.651 "num_base_bdevs": 4,
00:23:59.651 "num_base_bdevs_discovered": 4,
00:23:59.651 "num_base_bdevs_operational": 4,
00:23:59.651 "base_bdevs_list": [
00:23:59.651 {
00:23:59.651 "name": "BaseBdev1",
00:23:59.651 "uuid": "57bae558-89ac-5d81-a134-aebb2bab694c",
00:23:59.651 "is_configured": true,
00:23:59.651 "data_offset": 2048,
00:23:59.651 "data_size": 63488
00:23:59.651 },
00:23:59.651 {
00:23:59.651 "name": "BaseBdev2",
00:23:59.651 "uuid": "06785233-d46e-5e69-b499-aaecdb566d98",
00:23:59.651 "is_configured": true,
00:23:59.651 "data_offset": 2048,
00:23:59.651 "data_size": 63488
00:23:59.651 },
00:23:59.651 {
00:23:59.651 "name": "BaseBdev3",
00:23:59.651 "uuid": "3b2c50ea-552e-5213-86f5-af47e62cb9c4",
00:23:59.651 "is_configured": true,
00:23:59.651 "data_offset": 2048,
00:23:59.651 "data_size": 63488
00:23:59.651 },
00:23:59.651 {
00:23:59.651 "name": "BaseBdev4",
00:23:59.651 "uuid": "0b2bffd5-310c-504d-bec0-d627a75d731f",
00:23:59.651 "is_configured": true,
00:23:59.651 "data_offset": 2048,
00:23:59.651 "data_size": 63488
00:23:59.651 }
00:23:59.651 ]
00:23:59.651 }'
00:23:59.651 17:17:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:23:59.651 17:17:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:00.587 17:17:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:24:00.845 [2024-07-23 17:17:56.144199] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:24:00.845 [2024-07-23 17:17:56.144247] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:24:00.845 [2024-07-23 17:17:56.147413] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:24:00.845 [2024-07-23 17:17:56.147457] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:00.845 [2024-07-23 17:17:56.147495] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:24:00.845 [2024-07-23 17:17:56.147506] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b0510 name raid_bdev1, state offline
00:24:00.845 0
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 4185864
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 4185864 ']'
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 4185864
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4185864
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:24:00.845 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:24:00.846 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4185864'
00:24:00.846 killing process with pid 4185864
17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 4185864
[2024-07-23 17:17:56.219660] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 4185864
[2024-07-23 17:17:56.251401] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.uUF485YKS8
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.37
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.37 != \0\.\0\0 ]]
00:24:01.142
00:24:01.142 real 0m8.436s
00:24:01.142 user 0m13.855s
00:24:01.142 sys 0m1.465s
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:24:01.142 17:17:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:01.142 ************************************
00:24:01.142 END TEST raid_write_error_test
00:24:01.142 ************************************
00:24:01.142 17:17:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:24:01.142 17:17:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:24:01.142 17:17:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false
00:24:01.142 17:17:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:24:01.142 17:17:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:24:01.142 17:17:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:24:01.418 ************************************
00:24:01.418 START TEST raid_state_function_test
00:24:01.418 ************************************
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']'
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=4187060
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4187060'
00:24:01.418 Process raid pid: 4187060
17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 4187060 /var/tmp/spdk-raid.sock
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 4187060 ']'
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:24:01.418 17:17:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:01.418 [2024-07-23 17:17:56.645528] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:24:01.418 [2024-07-23 17:17:56.645601] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:24:01.418 [2024-07-23 17:17:56.779262] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:01.418 [2024-07-23 17:17:56.833354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:24:01.677 [2024-07-23 17:17:56.895611] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:01.677 [2024-07-23 17:17:56.895648] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:02.245 17:17:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:24:02.245 17:17:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:24:02.245 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:24:02.504 [2024-07-23 17:17:57.669469] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:24:02.504 [2024-07-23 17:17:57.669508] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:24:02.504 [2024-07-23 17:17:57.669519] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:24:02.504 [2024-07-23 17:17:57.669535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:24:02.504 [2024-07-23 17:17:57.669544] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:24:02.504 [2024-07-23 17:17:57.669555] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:24:02.504 [2024-07-23 17:17:57.669563] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:24:02.504 [2024-07-23 17:17:57.669574] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:02.504 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:02.504 "name": "Existed_Raid",
00:24:02.504 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:02.504 "strip_size_kb": 64,
00:24:02.504 "state": "configuring", 00:24:02.504 "raid_level": "concat", 00:24:02.504 "superblock": false, 00:24:02.504 "num_base_bdevs": 4, 00:24:02.504 "num_base_bdevs_discovered": 0, 00:24:02.504 "num_base_bdevs_operational": 4, 00:24:02.504 "base_bdevs_list": [ 00:24:02.504 { 00:24:02.504 "name": "BaseBdev1", 00:24:02.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.504 "is_configured": false, 00:24:02.504 "data_offset": 0, 00:24:02.504 "data_size": 0 00:24:02.504 }, 00:24:02.505 { 00:24:02.505 "name": "BaseBdev2", 00:24:02.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.505 "is_configured": false, 00:24:02.505 "data_offset": 0, 00:24:02.505 "data_size": 0 00:24:02.505 }, 00:24:02.505 { 00:24:02.505 "name": "BaseBdev3", 00:24:02.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.505 "is_configured": false, 00:24:02.505 "data_offset": 0, 00:24:02.505 "data_size": 0 00:24:02.505 }, 00:24:02.505 { 00:24:02.505 "name": "BaseBdev4", 00:24:02.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.505 "is_configured": false, 00:24:02.505 "data_offset": 0, 00:24:02.505 "data_size": 0 00:24:02.505 } 00:24:02.505 ] 00:24:02.505 }' 00:24:02.505 17:17:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.505 17:17:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:03.440 17:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:03.699 [2024-07-23 17:17:58.916609] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:03.699 [2024-07-23 17:17:58.916637] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x136a430 name Existed_Raid, state configuring 00:24:03.699 17:17:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:03.699 [2024-07-23 17:17:59.097108] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:03.699 [2024-07-23 17:17:59.097136] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:03.699 [2024-07-23 17:17:59.097151] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:03.699 [2024-07-23 17:17:59.097162] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:03.699 [2024-07-23 17:17:59.097171] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:03.699 [2024-07-23 17:17:59.097182] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:03.699 [2024-07-23 17:17:59.097190] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:24:03.699 [2024-07-23 17:17:59.097201] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:24:03.699 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:03.958 [2024-07-23 17:17:59.279414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:03.958 BaseBdev1 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:03.958 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:04.217 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:04.476 [ 00:24:04.476 { 00:24:04.476 "name": "BaseBdev1", 00:24:04.476 "aliases": [ 00:24:04.476 "37a1c09d-44f1-49a7-8838-6750b101ef3e" 00:24:04.476 ], 00:24:04.476 "product_name": "Malloc disk", 00:24:04.476 "block_size": 512, 00:24:04.476 "num_blocks": 65536, 00:24:04.476 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e", 00:24:04.476 "assigned_rate_limits": { 00:24:04.476 "rw_ios_per_sec": 0, 00:24:04.476 "rw_mbytes_per_sec": 0, 00:24:04.476 "r_mbytes_per_sec": 0, 00:24:04.476 "w_mbytes_per_sec": 0 00:24:04.476 }, 00:24:04.476 "claimed": true, 00:24:04.476 "claim_type": "exclusive_write", 00:24:04.476 "zoned": false, 00:24:04.476 "supported_io_types": { 00:24:04.476 "read": true, 00:24:04.476 "write": true, 00:24:04.476 "unmap": true, 00:24:04.476 "flush": true, 00:24:04.476 "reset": true, 00:24:04.476 "nvme_admin": false, 00:24:04.476 "nvme_io": false, 00:24:04.476 "nvme_io_md": false, 00:24:04.476 "write_zeroes": true, 00:24:04.476 "zcopy": true, 00:24:04.476 "get_zone_info": false, 00:24:04.476 "zone_management": false, 00:24:04.476 "zone_append": false, 00:24:04.476 "compare": false, 00:24:04.476 "compare_and_write": false, 00:24:04.476 "abort": true, 00:24:04.476 "seek_hole": false, 00:24:04.476 "seek_data": false, 00:24:04.476 "copy": true, 00:24:04.476 "nvme_iov_md": 
false 00:24:04.476 }, 00:24:04.476 "memory_domains": [ 00:24:04.476 { 00:24:04.476 "dma_device_id": "system", 00:24:04.476 "dma_device_type": 1 00:24:04.476 }, 00:24:04.476 { 00:24:04.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:04.476 "dma_device_type": 2 00:24:04.476 } 00:24:04.476 ], 00:24:04.476 "driver_specific": {} 00:24:04.476 } 00:24:04.476 ] 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.476 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:04.736 17:17:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.736 "name": "Existed_Raid", 00:24:04.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.736 "strip_size_kb": 64, 00:24:04.736 "state": "configuring", 00:24:04.736 "raid_level": "concat", 00:24:04.736 "superblock": false, 00:24:04.736 "num_base_bdevs": 4, 00:24:04.736 "num_base_bdevs_discovered": 1, 00:24:04.736 "num_base_bdevs_operational": 4, 00:24:04.736 "base_bdevs_list": [ 00:24:04.736 { 00:24:04.736 "name": "BaseBdev1", 00:24:04.736 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e", 00:24:04.736 "is_configured": true, 00:24:04.736 "data_offset": 0, 00:24:04.736 "data_size": 65536 00:24:04.736 }, 00:24:04.736 { 00:24:04.736 "name": "BaseBdev2", 00:24:04.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.736 "is_configured": false, 00:24:04.736 "data_offset": 0, 00:24:04.736 "data_size": 0 00:24:04.736 }, 00:24:04.736 { 00:24:04.736 "name": "BaseBdev3", 00:24:04.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.736 "is_configured": false, 00:24:04.736 "data_offset": 0, 00:24:04.736 "data_size": 0 00:24:04.736 }, 00:24:04.736 { 00:24:04.736 "name": "BaseBdev4", 00:24:04.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.736 "is_configured": false, 00:24:04.736 "data_offset": 0, 00:24:04.736 "data_size": 0 00:24:04.736 } 00:24:04.736 ] 00:24:04.736 }' 00:24:04.736 17:17:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.736 17:17:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:05.303 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:05.303 [2024-07-23 17:18:00.683152] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:05.303 [2024-07-23 17:18:00.683197] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1369d60 name Existed_Raid, state configuring 00:24:05.303 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:05.562 [2024-07-23 17:18:00.927830] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:05.562 [2024-07-23 17:18:00.929246] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:05.562 [2024-07-23 17:18:00.929280] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:05.562 [2024-07-23 17:18:00.929290] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:05.562 [2024-07-23 17:18:00.929302] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:05.562 [2024-07-23 17:18:00.929311] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:24:05.562 [2024-07-23 17:18:00.929322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:05.562 17:18:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:05.821 17:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:05.821 "name": "Existed_Raid",
00:24:05.821 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:05.821 "strip_size_kb": 64,
00:24:05.821 "state": "configuring",
00:24:05.821 "raid_level": "concat",
00:24:05.821 "superblock": false,
00:24:05.821 "num_base_bdevs": 4,
00:24:05.821 "num_base_bdevs_discovered": 1,
00:24:05.821 "num_base_bdevs_operational": 4,
00:24:05.821 "base_bdevs_list": [
00:24:05.821 {
00:24:05.821 "name": "BaseBdev1",
00:24:05.821 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e",
00:24:05.821 "is_configured": true,
00:24:05.821 "data_offset": 0,
00:24:05.821 "data_size": 65536
00:24:05.821 },
00:24:05.821 {
00:24:05.821 "name": "BaseBdev2",
00:24:05.821 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:05.821 "is_configured": false,
00:24:05.821 "data_offset": 0,
00:24:05.821 "data_size": 0
00:24:05.822 },
00:24:05.822 {
00:24:05.822 "name": "BaseBdev3",
00:24:05.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.822 "is_configured": false, 00:24:05.822 "data_offset": 0, 00:24:05.822 "data_size": 0 00:24:05.822 }, 00:24:05.822 { 00:24:05.822 "name": "BaseBdev4", 00:24:05.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.822 "is_configured": false, 00:24:05.822 "data_offset": 0, 00:24:05.822 "data_size": 0 00:24:05.822 } 00:24:05.822 ] 00:24:05.822 }' 00:24:05.822 17:18:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.822 17:18:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:06.759 17:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:07.327 [2024-07-23 17:18:02.656922] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:07.327 BaseBdev2 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:07.327 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:07.585 17:18:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:08.152 [ 00:24:08.152 { 00:24:08.152 "name": "BaseBdev2", 00:24:08.152 "aliases": [ 00:24:08.152 "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f" 00:24:08.152 ], 00:24:08.152 "product_name": "Malloc disk", 00:24:08.152 "block_size": 512, 00:24:08.152 "num_blocks": 65536, 00:24:08.152 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f", 00:24:08.152 "assigned_rate_limits": { 00:24:08.152 "rw_ios_per_sec": 0, 00:24:08.152 "rw_mbytes_per_sec": 0, 00:24:08.152 "r_mbytes_per_sec": 0, 00:24:08.152 "w_mbytes_per_sec": 0 00:24:08.152 }, 00:24:08.152 "claimed": true, 00:24:08.152 "claim_type": "exclusive_write", 00:24:08.152 "zoned": false, 00:24:08.152 "supported_io_types": { 00:24:08.152 "read": true, 00:24:08.152 "write": true, 00:24:08.152 "unmap": true, 00:24:08.152 "flush": true, 00:24:08.152 "reset": true, 00:24:08.152 "nvme_admin": false, 00:24:08.152 "nvme_io": false, 00:24:08.152 "nvme_io_md": false, 00:24:08.152 "write_zeroes": true, 00:24:08.152 "zcopy": true, 00:24:08.152 "get_zone_info": false, 00:24:08.152 "zone_management": false, 00:24:08.152 "zone_append": false, 00:24:08.152 "compare": false, 00:24:08.152 "compare_and_write": false, 00:24:08.152 "abort": true, 00:24:08.152 "seek_hole": false, 00:24:08.152 "seek_data": false, 00:24:08.152 "copy": true, 00:24:08.152 "nvme_iov_md": false 00:24:08.152 }, 00:24:08.152 "memory_domains": [ 00:24:08.152 { 00:24:08.152 "dma_device_id": "system", 00:24:08.152 "dma_device_type": 1 00:24:08.152 }, 00:24:08.152 { 00:24:08.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.152 "dma_device_type": 2 00:24:08.152 } 00:24:08.152 ], 00:24:08.152 "driver_specific": {} 00:24:08.152 } 00:24:08.152 ] 00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:08.152 17:18:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:08.720 17:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:08.720 "name": "Existed_Raid",
00:24:08.720 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:08.720 "strip_size_kb": 64,
00:24:08.720 "state": "configuring",
00:24:08.720 "raid_level": "concat",
00:24:08.720 "superblock": false,
00:24:08.720 "num_base_bdevs": 4,
00:24:08.720 "num_base_bdevs_discovered": 2,
00:24:08.720 "num_base_bdevs_operational": 4,
00:24:08.720 "base_bdevs_list": [
00:24:08.720 {
00:24:08.720 "name": "BaseBdev1",
00:24:08.720 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e",
00:24:08.720 "is_configured": true,
00:24:08.720 "data_offset": 0,
00:24:08.720 "data_size": 65536
00:24:08.720 },
00:24:08.720 {
00:24:08.720 "name": "BaseBdev2",
00:24:08.720 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f",
00:24:08.720 "is_configured": true,
00:24:08.720 "data_offset": 0,
00:24:08.720 "data_size": 65536
00:24:08.720 },
00:24:08.720 {
00:24:08.720 "name": "BaseBdev3",
00:24:08.720 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:08.720 "is_configured": false,
00:24:08.720 "data_offset": 0,
00:24:08.720 "data_size": 0
00:24:08.720 },
00:24:08.720 {
00:24:08.720 "name": "BaseBdev4",
00:24:08.720 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:08.720 "is_configured": false,
00:24:08.720 "data_offset": 0,
00:24:08.720 "data_size": 0
00:24:08.720 }
00:24:08.720 ]
00:24:08.720 }'
00:24:08.720 17:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:08.720 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:09.287 17:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:24:09.546 [2024-07-23 17:18:04.880075] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:24:09.546 BaseBdev3
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:24:09.546 17:18:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:24:09.804 17:18:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:24:10.062 [
00:24:10.062 {
00:24:10.062 "name": "BaseBdev3",
00:24:10.062 "aliases": [
00:24:10.062 "de69f98c-ede5-4c9c-b845-29cb0d34e6b5"
00:24:10.062 ],
00:24:10.062 "product_name": "Malloc disk",
00:24:10.062 "block_size": 512,
00:24:10.062 "num_blocks": 65536,
00:24:10.062 "uuid": "de69f98c-ede5-4c9c-b845-29cb0d34e6b5",
00:24:10.062 "assigned_rate_limits": {
00:24:10.062 "rw_ios_per_sec": 0,
00:24:10.062 "rw_mbytes_per_sec": 0,
00:24:10.062 "r_mbytes_per_sec": 0,
00:24:10.062 "w_mbytes_per_sec": 0
00:24:10.062 },
00:24:10.062 "claimed": true,
00:24:10.062 "claim_type": "exclusive_write",
00:24:10.062 "zoned": false,
00:24:10.062 "supported_io_types": {
00:24:10.062 "read": true,
00:24:10.062 "write": true,
00:24:10.062 "unmap": true,
00:24:10.062 "flush": true,
00:24:10.062 "reset": true,
00:24:10.062 "nvme_admin": false,
00:24:10.062 "nvme_io": false,
00:24:10.062 "nvme_io_md": false,
00:24:10.062 "write_zeroes": true,
00:24:10.062 "zcopy": true,
00:24:10.062 "get_zone_info": false,
00:24:10.062 "zone_management": false,
00:24:10.062 "zone_append": false,
00:24:10.062 "compare": false,
00:24:10.062 "compare_and_write": false,
00:24:10.062 "abort": true,
00:24:10.062 "seek_hole": false,
00:24:10.062 "seek_data": false,
00:24:10.062 "copy":
true, 00:24:10.062 "nvme_iov_md": false 00:24:10.062 }, 00:24:10.062 "memory_domains": [ 00:24:10.062 { 00:24:10.062 "dma_device_id": "system", 00:24:10.062 "dma_device_type": 1 00:24:10.062 }, 00:24:10.062 { 00:24:10.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:10.062 "dma_device_type": 2 00:24:10.062 } 00:24:10.062 ], 00:24:10.062 "driver_specific": {} 00:24:10.062 } 00:24:10.062 ] 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:10.062 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:10.063 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
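The `verify_raid_bdev_state` calls above fetch `bdev_raid_get_bdevs all`, select the entry via `jq -r '.[] | select(.name == "Existed_Raid")'`, and compare state, raid level, strip size, and base-bdev counters. A hedged Python equivalent of that check (a sketch of the shell helper's logic, not the script itself; the sample JSON mirrors the fields shown in this log):

```python
import json

def verify_raid_bdev_state(raid_bdevs_json, name, expected_state,
                           expected_level, expected_strip_kb,
                           expected_operational):
    # Mirror the jq filter '.[] | select(.name == "<name>")' and the
    # subsequent field comparisons done by the shell helper.
    info = next(b for b in json.loads(raid_bdevs_json) if b["name"] == name)
    assert info["state"] == expected_state
    assert info["raid_level"] == expected_level
    assert info["strip_size_kb"] == expected_strip_kb
    assert info["num_base_bdevs_operational"] == expected_operational
    # "discovered" counts base bdevs that are actually configured.
    return sum(1 for base in info["base_bdevs_list"] if base["is_configured"])

# Sample modeled on the "configuring" dump earlier in this log:
sample = json.dumps([{
    "name": "Existed_Raid", "state": "configuring", "raid_level": "concat",
    "strip_size_kb": 64, "num_base_bdevs": 4, "num_base_bdevs_operational": 4,
    "base_bdevs_list": [
        {"name": "BaseBdev1", "is_configured": True},
        {"name": "BaseBdev2", "is_configured": True},
        {"name": "BaseBdev3", "is_configured": False},
        {"name": "BaseBdev4", "is_configured": False},
    ],
}])
discovered = verify_raid_bdev_state(sample, "Existed_Raid",
                                    "configuring", "concat", 64, 4)
```

The raid bdev stays in `configuring` until every base bdev is discovered; each iteration of the loop adds one malloc bdev and re-runs this check.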
00:24:10.063 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.321 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:10.321 "name": "Existed_Raid", 00:24:10.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.321 "strip_size_kb": 64, 00:24:10.321 "state": "configuring", 00:24:10.321 "raid_level": "concat", 00:24:10.321 "superblock": false, 00:24:10.321 "num_base_bdevs": 4, 00:24:10.321 "num_base_bdevs_discovered": 3, 00:24:10.321 "num_base_bdevs_operational": 4, 00:24:10.321 "base_bdevs_list": [ 00:24:10.321 { 00:24:10.321 "name": "BaseBdev1", 00:24:10.321 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e", 00:24:10.321 "is_configured": true, 00:24:10.321 "data_offset": 0, 00:24:10.321 "data_size": 65536 00:24:10.321 }, 00:24:10.321 { 00:24:10.321 "name": "BaseBdev2", 00:24:10.321 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f", 00:24:10.321 "is_configured": true, 00:24:10.321 "data_offset": 0, 00:24:10.321 "data_size": 65536 00:24:10.321 }, 00:24:10.321 { 00:24:10.321 "name": "BaseBdev3", 00:24:10.321 "uuid": "de69f98c-ede5-4c9c-b845-29cb0d34e6b5", 00:24:10.321 "is_configured": true, 00:24:10.321 "data_offset": 0, 00:24:10.321 "data_size": 65536 00:24:10.321 }, 00:24:10.321 { 00:24:10.321 "name": "BaseBdev4", 00:24:10.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.321 "is_configured": false, 00:24:10.321 "data_offset": 0, 00:24:10.321 "data_size": 0 00:24:10.321 } 00:24:10.321 ] 00:24:10.321 }' 00:24:10.321 17:18:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:10.321 17:18:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:11.257 17:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:24:11.257 [2024-07-23 17:18:06.553676] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:11.257 [2024-07-23 17:18:06.553716] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13699b0 00:24:11.257 [2024-07-23 17:18:06.553724] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:24:11.257 [2024-07-23 17:18:06.553946] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1417260 00:24:11.257 [2024-07-23 17:18:06.554084] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13699b0 00:24:11.257 [2024-07-23 17:18:06.554094] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13699b0 00:24:11.257 [2024-07-23 17:18:06.554273] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.257 BaseBdev4 00:24:11.257 17:18:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:24:11.257 17:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:24:11.257 17:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:11.257 17:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:24:11.257 17:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:11.258 17:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:11.258 17:18:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:11.825 17:18:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:24:12.392 [ 00:24:12.392 { 00:24:12.392 "name": "BaseBdev4", 00:24:12.392 "aliases": [ 00:24:12.392 "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8" 00:24:12.392 ], 00:24:12.392 "product_name": "Malloc disk", 00:24:12.392 "block_size": 512, 00:24:12.392 "num_blocks": 65536, 00:24:12.392 "uuid": "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8", 00:24:12.392 "assigned_rate_limits": { 00:24:12.392 "rw_ios_per_sec": 0, 00:24:12.392 "rw_mbytes_per_sec": 0, 00:24:12.392 "r_mbytes_per_sec": 0, 00:24:12.392 "w_mbytes_per_sec": 0 00:24:12.392 }, 00:24:12.392 "claimed": true, 00:24:12.392 "claim_type": "exclusive_write", 00:24:12.392 "zoned": false, 00:24:12.392 "supported_io_types": { 00:24:12.392 "read": true, 00:24:12.392 "write": true, 00:24:12.392 "unmap": true, 00:24:12.392 "flush": true, 00:24:12.392 "reset": true, 00:24:12.392 "nvme_admin": false, 00:24:12.392 "nvme_io": false, 00:24:12.392 "nvme_io_md": false, 00:24:12.392 "write_zeroes": true, 00:24:12.392 "zcopy": true, 00:24:12.392 "get_zone_info": false, 00:24:12.392 "zone_management": false, 00:24:12.392 "zone_append": false, 00:24:12.392 "compare": false, 00:24:12.392 "compare_and_write": false, 00:24:12.392 "abort": true, 00:24:12.392 "seek_hole": false, 00:24:12.392 "seek_data": false, 00:24:12.392 "copy": true, 00:24:12.392 "nvme_iov_md": false 00:24:12.392 }, 00:24:12.392 "memory_domains": [ 00:24:12.392 { 00:24:12.392 "dma_device_id": "system", 00:24:12.392 "dma_device_type": 1 00:24:12.392 }, 00:24:12.392 { 00:24:12.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:12.392 "dma_device_type": 2 00:24:12.392 } 00:24:12.392 ], 00:24:12.392 "driver_specific": {} 00:24:12.392 } 00:24:12.392 ] 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
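Each `bdev_get_bdevs` dump above carries the per-bdev properties that the test later probes with `jq .block_size`, `.md_size`, `.md_interleave`, and `.dif_type`. A minimal sketch of those checks (the `check_base_bdev` function is a hypothetical condensation of the jq probes, and the sample dict copies fields from the BaseBdev4 dump in this log):

```python
def check_base_bdev(info):
    # Condensed form of the jq probes in this log: block size must be 512
    # and the malloc base bdevs carry no metadata or DIF configuration
    # (jq prints null for absent keys, hence the .get() defaults).
    assert info["block_size"] == 512
    assert info.get("md_size") is None
    assert info.get("md_interleave") is None
    assert info.get("dif_type") is None
    # Base bdevs joined to the raid are claimed with an exclusive write claim.
    assert info["claimed"] and info["claim_type"] == "exclusive_write"
    return True

# Fields taken from the BaseBdev4 bdev_get_bdevs output above:
sample_bdev = {
    "name": "BaseBdev4",
    "block_size": 512,
    "num_blocks": 65536,
    "claimed": True,
    "claim_type": "exclusive_write",
}
ok = check_base_bdev(sample_bdev)
```

These property checks run once per base bdev name extracted from `base_bdevs_list`, which is why the same `[[ 512 == 512 ]]` and `[[ null == null ]]` comparisons repeat four times later in the log.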
00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:12.392 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.392 "name": "Existed_Raid", 00:24:12.392 "uuid": "ba3f9efd-f5d1-4dce-895b-905f1163c12e", 00:24:12.392 "strip_size_kb": 64, 00:24:12.392 "state": "online", 00:24:12.393 "raid_level": "concat", 00:24:12.393 "superblock": false, 00:24:12.393 "num_base_bdevs": 4, 00:24:12.393 
"num_base_bdevs_discovered": 4, 00:24:12.393 "num_base_bdevs_operational": 4, 00:24:12.393 "base_bdevs_list": [ 00:24:12.393 { 00:24:12.393 "name": "BaseBdev1", 00:24:12.393 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e", 00:24:12.393 "is_configured": true, 00:24:12.393 "data_offset": 0, 00:24:12.393 "data_size": 65536 00:24:12.393 }, 00:24:12.393 { 00:24:12.393 "name": "BaseBdev2", 00:24:12.393 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f", 00:24:12.393 "is_configured": true, 00:24:12.393 "data_offset": 0, 00:24:12.393 "data_size": 65536 00:24:12.393 }, 00:24:12.393 { 00:24:12.393 "name": "BaseBdev3", 00:24:12.393 "uuid": "de69f98c-ede5-4c9c-b845-29cb0d34e6b5", 00:24:12.393 "is_configured": true, 00:24:12.393 "data_offset": 0, 00:24:12.393 "data_size": 65536 00:24:12.393 }, 00:24:12.393 { 00:24:12.393 "name": "BaseBdev4", 00:24:12.393 "uuid": "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8", 00:24:12.393 "is_configured": true, 00:24:12.393 "data_offset": 0, 00:24:12.393 "data_size": 65536 00:24:12.393 } 00:24:12.393 ] 00:24:12.393 }' 00:24:12.393 17:18:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.393 17:18:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 
-- # jq '.[]' 00:24:12.959 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:13.219 [2024-07-23 17:18:08.571358] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:13.219 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:13.219 "name": "Existed_Raid", 00:24:13.219 "aliases": [ 00:24:13.219 "ba3f9efd-f5d1-4dce-895b-905f1163c12e" 00:24:13.219 ], 00:24:13.219 "product_name": "Raid Volume", 00:24:13.219 "block_size": 512, 00:24:13.219 "num_blocks": 262144, 00:24:13.219 "uuid": "ba3f9efd-f5d1-4dce-895b-905f1163c12e", 00:24:13.219 "assigned_rate_limits": { 00:24:13.219 "rw_ios_per_sec": 0, 00:24:13.219 "rw_mbytes_per_sec": 0, 00:24:13.219 "r_mbytes_per_sec": 0, 00:24:13.219 "w_mbytes_per_sec": 0 00:24:13.219 }, 00:24:13.219 "claimed": false, 00:24:13.219 "zoned": false, 00:24:13.219 "supported_io_types": { 00:24:13.219 "read": true, 00:24:13.219 "write": true, 00:24:13.219 "unmap": true, 00:24:13.219 "flush": true, 00:24:13.219 "reset": true, 00:24:13.219 "nvme_admin": false, 00:24:13.219 "nvme_io": false, 00:24:13.219 "nvme_io_md": false, 00:24:13.219 "write_zeroes": true, 00:24:13.219 "zcopy": false, 00:24:13.219 "get_zone_info": false, 00:24:13.219 "zone_management": false, 00:24:13.219 "zone_append": false, 00:24:13.219 "compare": false, 00:24:13.219 "compare_and_write": false, 00:24:13.219 "abort": false, 00:24:13.219 "seek_hole": false, 00:24:13.219 "seek_data": false, 00:24:13.219 "copy": false, 00:24:13.219 "nvme_iov_md": false 00:24:13.219 }, 00:24:13.219 "memory_domains": [ 00:24:13.219 { 00:24:13.219 "dma_device_id": "system", 00:24:13.219 "dma_device_type": 1 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.219 "dma_device_type": 2 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 
"dma_device_id": "system", 00:24:13.219 "dma_device_type": 1 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.219 "dma_device_type": 2 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "dma_device_id": "system", 00:24:13.219 "dma_device_type": 1 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.219 "dma_device_type": 2 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "dma_device_id": "system", 00:24:13.219 "dma_device_type": 1 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.219 "dma_device_type": 2 00:24:13.219 } 00:24:13.219 ], 00:24:13.219 "driver_specific": { 00:24:13.219 "raid": { 00:24:13.219 "uuid": "ba3f9efd-f5d1-4dce-895b-905f1163c12e", 00:24:13.219 "strip_size_kb": 64, 00:24:13.219 "state": "online", 00:24:13.219 "raid_level": "concat", 00:24:13.219 "superblock": false, 00:24:13.219 "num_base_bdevs": 4, 00:24:13.219 "num_base_bdevs_discovered": 4, 00:24:13.219 "num_base_bdevs_operational": 4, 00:24:13.219 "base_bdevs_list": [ 00:24:13.219 { 00:24:13.219 "name": "BaseBdev1", 00:24:13.219 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e", 00:24:13.219 "is_configured": true, 00:24:13.219 "data_offset": 0, 00:24:13.219 "data_size": 65536 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "name": "BaseBdev2", 00:24:13.219 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f", 00:24:13.219 "is_configured": true, 00:24:13.219 "data_offset": 0, 00:24:13.219 "data_size": 65536 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "name": "BaseBdev3", 00:24:13.219 "uuid": "de69f98c-ede5-4c9c-b845-29cb0d34e6b5", 00:24:13.219 "is_configured": true, 00:24:13.219 "data_offset": 0, 00:24:13.219 "data_size": 65536 00:24:13.219 }, 00:24:13.219 { 00:24:13.219 "name": "BaseBdev4", 00:24:13.219 "uuid": "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8", 00:24:13.219 "is_configured": true, 00:24:13.219 "data_offset": 0, 00:24:13.219 "data_size": 65536 00:24:13.219 } 00:24:13.219 ] 
00:24:13.219 } 00:24:13.219 } 00:24:13.219 }' 00:24:13.219 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:13.219 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:13.219 BaseBdev2 00:24:13.219 BaseBdev3 00:24:13.219 BaseBdev4' 00:24:13.219 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:13.219 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:13.219 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:13.479 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:13.479 "name": "BaseBdev1", 00:24:13.479 "aliases": [ 00:24:13.479 "37a1c09d-44f1-49a7-8838-6750b101ef3e" 00:24:13.479 ], 00:24:13.479 "product_name": "Malloc disk", 00:24:13.479 "block_size": 512, 00:24:13.479 "num_blocks": 65536, 00:24:13.479 "uuid": "37a1c09d-44f1-49a7-8838-6750b101ef3e", 00:24:13.479 "assigned_rate_limits": { 00:24:13.479 "rw_ios_per_sec": 0, 00:24:13.479 "rw_mbytes_per_sec": 0, 00:24:13.479 "r_mbytes_per_sec": 0, 00:24:13.479 "w_mbytes_per_sec": 0 00:24:13.479 }, 00:24:13.479 "claimed": true, 00:24:13.479 "claim_type": "exclusive_write", 00:24:13.479 "zoned": false, 00:24:13.479 "supported_io_types": { 00:24:13.479 "read": true, 00:24:13.479 "write": true, 00:24:13.479 "unmap": true, 00:24:13.479 "flush": true, 00:24:13.479 "reset": true, 00:24:13.479 "nvme_admin": false, 00:24:13.479 "nvme_io": false, 00:24:13.479 "nvme_io_md": false, 00:24:13.479 "write_zeroes": true, 00:24:13.479 "zcopy": true, 00:24:13.479 "get_zone_info": false, 00:24:13.479 "zone_management": false, 00:24:13.479 "zone_append": false, 00:24:13.479 "compare": 
false, 00:24:13.479 "compare_and_write": false, 00:24:13.479 "abort": true, 00:24:13.479 "seek_hole": false, 00:24:13.479 "seek_data": false, 00:24:13.479 "copy": true, 00:24:13.738 "nvme_iov_md": false 00:24:13.738 }, 00:24:13.738 "memory_domains": [ 00:24:13.738 { 00:24:13.738 "dma_device_id": "system", 00:24:13.738 "dma_device_type": 1 00:24:13.738 }, 00:24:13.738 { 00:24:13.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:13.738 "dma_device_type": 2 00:24:13.738 } 00:24:13.738 ], 00:24:13.738 "driver_specific": {} 00:24:13.738 }' 00:24:13.738 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:13.738 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:13.738 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:13.738 17:18:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:13.738 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:13.738 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:13.738 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:13.738 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:13.738 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:13.996 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:13.996 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:13.996 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:13.996 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:13.996 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:13.996 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:14.254 "name": "BaseBdev2", 00:24:14.254 "aliases": [ 00:24:14.254 "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f" 00:24:14.254 ], 00:24:14.254 "product_name": "Malloc disk", 00:24:14.254 "block_size": 512, 00:24:14.254 "num_blocks": 65536, 00:24:14.254 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f", 00:24:14.254 "assigned_rate_limits": { 00:24:14.254 "rw_ios_per_sec": 0, 00:24:14.254 "rw_mbytes_per_sec": 0, 00:24:14.254 "r_mbytes_per_sec": 0, 00:24:14.254 "w_mbytes_per_sec": 0 00:24:14.254 }, 00:24:14.254 "claimed": true, 00:24:14.254 "claim_type": "exclusive_write", 00:24:14.254 "zoned": false, 00:24:14.254 "supported_io_types": { 00:24:14.254 "read": true, 00:24:14.254 "write": true, 00:24:14.254 "unmap": true, 00:24:14.254 "flush": true, 00:24:14.254 "reset": true, 00:24:14.254 "nvme_admin": false, 00:24:14.254 "nvme_io": false, 00:24:14.254 "nvme_io_md": false, 00:24:14.254 "write_zeroes": true, 00:24:14.254 "zcopy": true, 00:24:14.254 "get_zone_info": false, 00:24:14.254 "zone_management": false, 00:24:14.254 "zone_append": false, 00:24:14.254 "compare": false, 00:24:14.254 "compare_and_write": false, 00:24:14.254 "abort": true, 00:24:14.254 "seek_hole": false, 00:24:14.254 "seek_data": false, 00:24:14.254 "copy": true, 00:24:14.254 "nvme_iov_md": false 00:24:14.254 }, 00:24:14.254 "memory_domains": [ 00:24:14.254 { 00:24:14.254 "dma_device_id": "system", 00:24:14.254 "dma_device_type": 1 00:24:14.254 }, 00:24:14.254 { 00:24:14.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.254 "dma_device_type": 2 00:24:14.254 } 00:24:14.254 ], 00:24:14.254 "driver_specific": {} 00:24:14.254 }' 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:14.254 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:14.513 17:18:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:14.772 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:14.772 "name": "BaseBdev3", 00:24:14.772 "aliases": [ 00:24:14.772 "de69f98c-ede5-4c9c-b845-29cb0d34e6b5" 00:24:14.772 ], 00:24:14.772 "product_name": "Malloc disk", 00:24:14.772 "block_size": 512, 00:24:14.772 "num_blocks": 65536, 00:24:14.772 "uuid": "de69f98c-ede5-4c9c-b845-29cb0d34e6b5", 
00:24:14.772 "assigned_rate_limits": { 00:24:14.772 "rw_ios_per_sec": 0, 00:24:14.772 "rw_mbytes_per_sec": 0, 00:24:14.772 "r_mbytes_per_sec": 0, 00:24:14.772 "w_mbytes_per_sec": 0 00:24:14.772 }, 00:24:14.772 "claimed": true, 00:24:14.772 "claim_type": "exclusive_write", 00:24:14.772 "zoned": false, 00:24:14.772 "supported_io_types": { 00:24:14.772 "read": true, 00:24:14.772 "write": true, 00:24:14.772 "unmap": true, 00:24:14.772 "flush": true, 00:24:14.772 "reset": true, 00:24:14.772 "nvme_admin": false, 00:24:14.772 "nvme_io": false, 00:24:14.772 "nvme_io_md": false, 00:24:14.772 "write_zeroes": true, 00:24:14.772 "zcopy": true, 00:24:14.772 "get_zone_info": false, 00:24:14.772 "zone_management": false, 00:24:14.772 "zone_append": false, 00:24:14.772 "compare": false, 00:24:14.772 "compare_and_write": false, 00:24:14.772 "abort": true, 00:24:14.772 "seek_hole": false, 00:24:14.772 "seek_data": false, 00:24:14.772 "copy": true, 00:24:14.772 "nvme_iov_md": false 00:24:14.772 }, 00:24:14.772 "memory_domains": [ 00:24:14.772 { 00:24:14.772 "dma_device_id": "system", 00:24:14.772 "dma_device_type": 1 00:24:14.772 }, 00:24:14.772 { 00:24:14.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:14.772 "dma_device_type": 2 00:24:14.772 } 00:24:14.772 ], 00:24:14.772 "driver_specific": {} 00:24:14.772 }' 00:24:14.772 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:14.772 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:15.029 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.287 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:15.287 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:15.287 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:15.287 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:24:15.287 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:15.546 "name": "BaseBdev4", 00:24:15.546 "aliases": [ 00:24:15.546 "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8" 00:24:15.546 ], 00:24:15.546 "product_name": "Malloc disk", 00:24:15.546 "block_size": 512, 00:24:15.546 "num_blocks": 65536, 00:24:15.546 "uuid": "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8", 00:24:15.546 "assigned_rate_limits": { 00:24:15.546 "rw_ios_per_sec": 0, 00:24:15.546 "rw_mbytes_per_sec": 0, 00:24:15.546 "r_mbytes_per_sec": 0, 00:24:15.546 "w_mbytes_per_sec": 0 00:24:15.546 }, 00:24:15.546 "claimed": true, 00:24:15.546 "claim_type": "exclusive_write", 00:24:15.546 "zoned": false, 00:24:15.546 "supported_io_types": { 00:24:15.546 "read": true, 00:24:15.546 "write": true, 00:24:15.546 "unmap": true, 00:24:15.546 "flush": true, 00:24:15.546 "reset": true, 00:24:15.546 "nvme_admin": false, 00:24:15.546 "nvme_io": false, 00:24:15.546 "nvme_io_md": false, 00:24:15.546 "write_zeroes": true, 
00:24:15.546 "zcopy": true,
00:24:15.546 "get_zone_info": false,
00:24:15.546 "zone_management": false,
00:24:15.546 "zone_append": false,
00:24:15.546 "compare": false,
00:24:15.546 "compare_and_write": false,
00:24:15.546 "abort": true,
00:24:15.546 "seek_hole": false,
00:24:15.546 "seek_data": false,
00:24:15.546 "copy": true,
00:24:15.546 "nvme_iov_md": false
00:24:15.546 },
00:24:15.546 "memory_domains": [
00:24:15.546 {
00:24:15.546 "dma_device_id": "system",
00:24:15.546 "dma_device_type": 1
00:24:15.546 },
00:24:15.546 {
00:24:15.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:24:15.546 "dma_device_type": 2
00:24:15.546 }
00:24:15.546 ],
00:24:15.546 "driver_specific": {}
00:24:15.546 }'
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:24:15.546 17:18:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:24:15.805 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:24:15.805 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:24:15.805 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:24:15.805 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:24:15.805 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:24:15.805 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:24:16.372 [2024-07-23 17:18:11.647232] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:24:16.372 [2024-07-23 17:18:11.647262] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:24:16.372 [2024-07-23 17:18:11.647321] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:16.372 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:16.631 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:16.631 "name": "Existed_Raid",
00:24:16.631 "uuid": "ba3f9efd-f5d1-4dce-895b-905f1163c12e",
00:24:16.631 "strip_size_kb": 64,
00:24:16.631 "state": "offline",
00:24:16.631 "raid_level": "concat",
00:24:16.631 "superblock": false,
00:24:16.631 "num_base_bdevs": 4,
00:24:16.631 "num_base_bdevs_discovered": 3,
00:24:16.631 "num_base_bdevs_operational": 3,
00:24:16.631 "base_bdevs_list": [
00:24:16.631 {
00:24:16.631 "name": null,
00:24:16.631 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:16.631 "is_configured": false,
00:24:16.631 "data_offset": 0,
00:24:16.631 "data_size": 65536
00:24:16.631 },
00:24:16.631 {
00:24:16.631 "name": "BaseBdev2",
00:24:16.631 "uuid": "af3a7e53-dd2d-48d7-af69-f87e3a7a4b5f",
00:24:16.631 "is_configured": true,
00:24:16.631 "data_offset": 0,
00:24:16.631 "data_size": 65536
00:24:16.631 },
00:24:16.631 {
00:24:16.631 "name": "BaseBdev3",
00:24:16.631 "uuid": "de69f98c-ede5-4c9c-b845-29cb0d34e6b5",
00:24:16.631 "is_configured": true,
00:24:16.631 "data_offset": 0,
00:24:16.631 "data_size": 65536
00:24:16.631 },
00:24:16.631 {
00:24:16.631 "name": "BaseBdev4",
00:24:16.631 "uuid": "7549974b-8cf3-4fbc-acb7-0b68fb99f1a8",
00:24:16.631 "is_configured": true,
00:24:16.631 "data_offset": 0,
00:24:16.631 "data_size": 65536
00:24:16.631 }
00:24:16.631 ]
00:24:16.631 }'
00:24:16.631 17:18:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:16.631 17:18:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:17.198 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:24:17.198 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:24:17.198 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:17.198 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:24:17.457 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:24:17.457 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:24:17.457 17:18:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:24:18.025 [2024-07-23 17:18:13.269073] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:24:18.025 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:24:18.025 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:24:18.025 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:18.025 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:24:18.326 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:24:18.326 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:24:18.326 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:24:18.591 [2024-07-23 17:18:13.782875] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:24:18.591 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:24:18.591 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:24:18.591 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:18.591 17:18:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:24:18.850 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:24:18.850 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:24:18.850 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4
00:24:19.109 [2024-07-23 17:18:14.286688] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4
00:24:19.109 [2024-07-23 17:18:14.286731] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13699b0 name Existed_Raid, state offline
00:24:19.109 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:24:19.109 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:24:19.109 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:19.109 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:24:19.368 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:24:19.368 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:24:19.368 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']'
00:24:19.368 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:24:19.368 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:24:19.368 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:24:19.626 BaseBdev2
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:24:19.626 17:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:24:19.626 17:18:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:24:20.194 [
00:24:20.194 {
00:24:20.194 "name": "BaseBdev2",
00:24:20.194 "aliases": [
00:24:20.194 "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f"
00:24:20.194 ],
00:24:20.194 "product_name": "Malloc disk",
00:24:20.194 "block_size": 512,
00:24:20.194 "num_blocks": 65536,
00:24:20.194 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f",
00:24:20.194 "assigned_rate_limits": {
00:24:20.194 "rw_ios_per_sec": 0,
00:24:20.194 "rw_mbytes_per_sec": 0,
00:24:20.194 "r_mbytes_per_sec": 0,
00:24:20.194 "w_mbytes_per_sec": 0
00:24:20.194 },
00:24:20.194 "claimed": false,
00:24:20.194 "zoned": false,
00:24:20.194 "supported_io_types": {
00:24:20.194 "read": true,
00:24:20.194 "write": true,
00:24:20.194 "unmap": true,
00:24:20.194 "flush": true,
00:24:20.194 "reset": true,
00:24:20.194 "nvme_admin": false,
00:24:20.194 "nvme_io": false,
00:24:20.194 "nvme_io_md": false,
00:24:20.194 "write_zeroes": true,
00:24:20.194 "zcopy": true,
00:24:20.194 "get_zone_info": false,
00:24:20.194 "zone_management": false,
00:24:20.194 "zone_append": false,
00:24:20.194 "compare": false,
00:24:20.194 "compare_and_write": false,
00:24:20.194 "abort": true,
00:24:20.194 "seek_hole": false,
00:24:20.194 "seek_data": false,
00:24:20.194 "copy": true,
00:24:20.194 "nvme_iov_md": false
00:24:20.194 },
00:24:20.194 "memory_domains": [
00:24:20.194 {
00:24:20.194 "dma_device_id": "system",
00:24:20.194 "dma_device_type": 1
00:24:20.194 },
00:24:20.194 {
00:24:20.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:24:20.194 "dma_device_type": 2
00:24:20.194 }
00:24:20.194 ],
00:24:20.194 "driver_specific": {}
00:24:20.194 }
00:24:20.194 ]
00:24:20.194 17:18:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:24:20.194 17:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:24:20.194 17:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:24:20.194 17:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:24:20.762 BaseBdev3
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:24:20.762 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:24:21.330 17:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:24:21.899 [
00:24:21.899 {
00:24:21.899 "name": "BaseBdev3",
00:24:21.899 "aliases": [
00:24:21.899 "0ecd830a-0f6a-4064-864d-bdee2129821c"
00:24:21.899 ],
00:24:21.899 "product_name": "Malloc disk",
00:24:21.899 "block_size": 512,
00:24:21.899 "num_blocks": 65536,
00:24:21.899 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c",
00:24:21.899 "assigned_rate_limits": {
00:24:21.899 "rw_ios_per_sec": 0,
00:24:21.899 "rw_mbytes_per_sec": 0,
00:24:21.899 "r_mbytes_per_sec": 0,
00:24:21.899 "w_mbytes_per_sec": 0
00:24:21.899 },
00:24:21.899 "claimed": false,
00:24:21.899 "zoned": false,
00:24:21.899 "supported_io_types": {
00:24:21.899 "read": true,
00:24:21.899 "write": true,
00:24:21.899 "unmap": true,
00:24:21.899 "flush": true,
00:24:21.899 "reset": true,
00:24:21.899 "nvme_admin": false,
00:24:21.899 "nvme_io": false,
00:24:21.899 "nvme_io_md": false,
00:24:21.899 "write_zeroes": true,
00:24:21.899 "zcopy": true,
00:24:21.899 "get_zone_info": false,
00:24:21.899 "zone_management": false,
00:24:21.899 "zone_append": false,
00:24:21.899 "compare": false,
00:24:21.899 "compare_and_write": false,
00:24:21.899 "abort": true,
00:24:21.899 "seek_hole": false,
00:24:21.899 "seek_data": false,
00:24:21.899 "copy": true,
00:24:21.899 "nvme_iov_md": false
00:24:21.899 },
00:24:21.899 "memory_domains": [
00:24:21.899 {
00:24:21.899 "dma_device_id": "system",
00:24:21.899 "dma_device_type": 1
00:24:21.899 },
00:24:21.899 {
00:24:21.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:24:21.899 "dma_device_type": 2
00:24:21.899 }
00:24:21.899 ],
00:24:21.899 "driver_specific": {}
00:24:21.899 }
00:24:21.899 ]
00:24:21.899 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:24:21.899 17:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:24:21.899 17:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:24:21.899 17:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:24:22.158 BaseBdev4
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:24:22.417 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:24:22.676 17:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:24:22.935 [
00:24:22.935 {
00:24:22.935 "name": "BaseBdev4",
00:24:22.935 "aliases": [
00:24:22.935 "459a5deb-8d3a-42da-aa6b-b1b6623de67f"
00:24:22.935 ],
00:24:22.935 "product_name": "Malloc disk",
00:24:22.935 "block_size": 512,
00:24:22.935 "num_blocks": 65536,
00:24:22.935 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f",
00:24:22.935 "assigned_rate_limits": {
00:24:22.935 "rw_ios_per_sec": 0,
00:24:22.935 "rw_mbytes_per_sec": 0,
00:24:22.935 "r_mbytes_per_sec": 0,
00:24:22.935 "w_mbytes_per_sec": 0
00:24:22.935 },
00:24:22.935 "claimed": false,
00:24:22.935 "zoned": false,
00:24:22.935 "supported_io_types": {
00:24:22.935 "read": true,
00:24:22.935 "write": true,
00:24:22.935 "unmap": true,
00:24:22.935 "flush": true,
00:24:22.935 "reset": true,
00:24:22.935 "nvme_admin": false,
00:24:22.935 "nvme_io": false,
00:24:22.935 "nvme_io_md": false,
00:24:22.935 "write_zeroes": true,
00:24:22.935 "zcopy": true,
00:24:22.935 "get_zone_info": false,
00:24:22.935 "zone_management": false,
00:24:22.935 "zone_append": false,
00:24:22.935 "compare": false,
00:24:22.935 "compare_and_write": false,
00:24:22.935 "abort": true,
00:24:22.935 "seek_hole": false,
00:24:22.935 "seek_data": false,
00:24:22.935 "copy": true,
00:24:22.935 "nvme_iov_md": false
00:24:22.935 },
00:24:22.935 "memory_domains": [
00:24:22.935 {
00:24:22.935 "dma_device_id": "system",
00:24:22.935 "dma_device_type": 1
00:24:22.935 },
00:24:22.935 {
00:24:22.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:24:22.935 "dma_device_type": 2
00:24:22.935 }
00:24:22.935 ],
00:24:22.935 "driver_specific": {}
00:24:22.935 }
00:24:22.935 ]
00:24:22.935 17:18:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:24:22.935 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:24:22.935 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:24:22.935 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:24:23.194 [2024-07-23 17:18:18.358155] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:24:23.194 [2024-07-23 17:18:18.358197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:24:23.194 [2024-07-23 17:18:18.358216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:24:23.194 [2024-07-23 17:18:18.359491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:24:23.194 [2024-07-23 17:18:18.359533] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:23.194 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:23.454 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:23.454 "name": "Existed_Raid",
00:24:23.454 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:23.454 "strip_size_kb": 64,
00:24:23.454 "state": "configuring",
00:24:23.454 "raid_level": "concat",
00:24:23.454 "superblock": false,
00:24:23.454 "num_base_bdevs": 4,
00:24:23.454 "num_base_bdevs_discovered": 3,
00:24:23.454 "num_base_bdevs_operational": 4,
00:24:23.454 "base_bdevs_list": [
00:24:23.454 {
00:24:23.454 "name": "BaseBdev1",
00:24:23.454 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:23.454 "is_configured": false,
00:24:23.454 "data_offset": 0,
00:24:23.454 "data_size": 0
00:24:23.454 },
00:24:23.454 {
00:24:23.454 "name": "BaseBdev2",
00:24:23.454 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f",
00:24:23.454 "is_configured": true,
00:24:23.454 "data_offset": 0,
00:24:23.454 "data_size": 65536
00:24:23.454 },
00:24:23.454 {
00:24:23.454 "name": "BaseBdev3",
00:24:23.454 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c",
00:24:23.454 "is_configured": true,
00:24:23.454 "data_offset": 0,
00:24:23.454 "data_size": 65536
00:24:23.454 },
00:24:23.454 {
00:24:23.454 "name": "BaseBdev4",
00:24:23.454 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f",
00:24:23.454 "is_configured": true,
00:24:23.454 "data_offset": 0,
00:24:23.454 "data_size": 65536
00:24:23.454 }
00:24:23.454 ]
00:24:23.454 }'
00:24:23.454 17:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:23.454 17:18:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:24.021 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:24:24.279 [2024-07-23 17:18:19.469076] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:24.279 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:24.280 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:24.539 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:24.539 "name": "Existed_Raid",
00:24:24.539 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:24.539 "strip_size_kb": 64,
00:24:24.539 "state": "configuring",
00:24:24.539 "raid_level": "concat",
00:24:24.539 "superblock": false,
00:24:24.539 "num_base_bdevs": 4,
00:24:24.539 "num_base_bdevs_discovered": 2,
00:24:24.539 "num_base_bdevs_operational": 4,
00:24:24.539 "base_bdevs_list": [
00:24:24.539 {
00:24:24.539 "name": "BaseBdev1",
00:24:24.539 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:24.539 "is_configured": false,
00:24:24.539 "data_offset": 0,
00:24:24.539 "data_size": 0
00:24:24.539 },
00:24:24.539 {
00:24:24.539 "name": null,
00:24:24.539 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f",
00:24:24.539 "is_configured": false,
00:24:24.539 "data_offset": 0,
00:24:24.539 "data_size": 65536
00:24:24.539 },
00:24:24.539 {
00:24:24.539 "name": "BaseBdev3",
00:24:24.539 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c",
00:24:24.539 "is_configured": true,
00:24:24.539 "data_offset": 0,
00:24:24.539 "data_size": 65536
00:24:24.539 },
00:24:24.539 {
00:24:24.539 "name": "BaseBdev4",
00:24:24.539 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f",
00:24:24.539 "is_configured": true,
00:24:24.539 "data_offset": 0,
00:24:24.539 "data_size": 65536
00:24:24.539 }
00:24:24.539 ]
00:24:24.539 }'
00:24:24.539 17:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:24.539 17:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:25.106 17:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:25.106 17:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:24:25.365 17:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:24:25.365 17:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:24:25.624 [2024-07-23 17:18:20.843998] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:24:25.624 BaseBdev1
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:24:25.624 17:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:24:25.883 [
00:24:25.883 {
00:24:25.883 "name": "BaseBdev1",
00:24:25.883 "aliases": [
00:24:25.883 "268e6657-e712-457f-8e00-b760c91c535f"
00:24:25.883 ],
00:24:25.883 "product_name": "Malloc disk",
00:24:25.883 "block_size": 512,
00:24:25.883 "num_blocks": 65536,
00:24:25.883 "uuid": "268e6657-e712-457f-8e00-b760c91c535f",
00:24:25.883 "assigned_rate_limits": {
00:24:25.883 "rw_ios_per_sec": 0,
00:24:25.883 "rw_mbytes_per_sec": 0,
00:24:25.883 "r_mbytes_per_sec": 0,
00:24:25.883 "w_mbytes_per_sec": 0
00:24:25.883 },
00:24:25.883 "claimed": true,
00:24:25.883 "claim_type": "exclusive_write",
00:24:25.883 "zoned": false,
00:24:25.883 "supported_io_types": {
00:24:25.883 "read": true,
00:24:25.883 "write": true,
00:24:25.883 "unmap": true,
00:24:25.883 "flush": true,
00:24:25.883 "reset": true,
00:24:25.883 "nvme_admin": false,
00:24:25.883 "nvme_io": false,
00:24:25.883 "nvme_io_md": false,
00:24:25.883 "write_zeroes": true,
00:24:25.883 "zcopy": true,
00:24:25.883 "get_zone_info": false,
00:24:25.883 "zone_management": false,
00:24:25.883 "zone_append": false,
00:24:25.883 "compare": false,
00:24:25.883 "compare_and_write": false,
00:24:25.883 "abort": true,
00:24:25.883 "seek_hole": false,
00:24:25.883 "seek_data": false,
00:24:25.883 "copy": true,
00:24:25.883 "nvme_iov_md": false
00:24:25.883 },
00:24:25.883 "memory_domains": [
00:24:25.883 {
00:24:25.883 "dma_device_id": "system",
00:24:25.883 "dma_device_type": 1
00:24:25.883 },
00:24:25.883 {
00:24:25.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:24:25.883 "dma_device_type": 2
00:24:25.883 }
00:24:25.883 ],
00:24:25.883 "driver_specific": {}
00:24:25.883 }
00:24:25.883 ]
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:25.883 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:26.142 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:26.142 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:26.142 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:26.142 "name": "Existed_Raid",
00:24:26.142 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:26.142 "strip_size_kb": 64,
00:24:26.142 "state": "configuring",
00:24:26.142 "raid_level": "concat",
00:24:26.142 "superblock": false,
00:24:26.142 "num_base_bdevs": 4,
00:24:26.142 "num_base_bdevs_discovered": 3,
00:24:26.142 "num_base_bdevs_operational": 4,
00:24:26.142 "base_bdevs_list": [
00:24:26.142 {
00:24:26.142 "name": "BaseBdev1",
00:24:26.142 "uuid": "268e6657-e712-457f-8e00-b760c91c535f",
00:24:26.142 "is_configured": true,
00:24:26.142 "data_offset": 0,
00:24:26.142 "data_size": 65536
00:24:26.142 },
00:24:26.142 {
00:24:26.142 "name": null,
00:24:26.142 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f",
00:24:26.142 "is_configured": false,
00:24:26.142 "data_offset": 0,
00:24:26.142 "data_size": 65536
00:24:26.142 },
00:24:26.142 {
00:24:26.142 "name": "BaseBdev3",
00:24:26.142 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c",
00:24:26.142 "is_configured": true,
00:24:26.142 "data_offset": 0,
00:24:26.142 "data_size": 65536
00:24:26.142 },
00:24:26.142 {
00:24:26.142 "name": "BaseBdev4",
00:24:26.142 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f",
00:24:26.142 "is_configured": true,
00:24:26.142 "data_offset": 0,
00:24:26.142 "data_size": 65536
00:24:26.142 }
00:24:26.142 ]
00:24:26.142 }'
00:24:26.142 17:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:26.142 17:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:27.077 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:27.078 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:24:27.078 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:24:27.078 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:24:27.336 [2024-07-23 17:18:22.620731] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:27.336 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:24:27.595 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:27.595 "name": "Existed_Raid",
00:24:27.595 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:27.595 "strip_size_kb": 64,
00:24:27.595 "state": "configuring",
00:24:27.595 "raid_level": "concat",
00:24:27.595 "superblock": false,
00:24:27.595 "num_base_bdevs": 4,
00:24:27.595 "num_base_bdevs_discovered": 2,
00:24:27.595 "num_base_bdevs_operational": 4,
00:24:27.595 "base_bdevs_list": [
00:24:27.595 {
00:24:27.595 "name": "BaseBdev1",
00:24:27.595 "uuid": "268e6657-e712-457f-8e00-b760c91c535f",
00:24:27.595 "is_configured": true,
00:24:27.595 "data_offset": 0,
00:24:27.595 "data_size": 65536
00:24:27.595 },
00:24:27.595 {
00:24:27.595 "name": null,
00:24:27.595 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f",
00:24:27.595 "is_configured": false,
00:24:27.595 "data_offset": 0,
00:24:27.595 "data_size": 65536
00:24:27.595 },
00:24:27.595 {
00:24:27.595 "name": null,
00:24:27.595 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c",
00:24:27.595 "is_configured": false,
00:24:27.595 "data_offset": 0,
00:24:27.595 "data_size": 65536
00:24:27.595 },
00:24:27.595 {
00:24:27.595 "name": "BaseBdev4",
00:24:27.595 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f",
00:24:27.595 "is_configured": true,
00:24:27.595 "data_offset": 0,
00:24:27.595 "data_size": 65536
00:24:27.595 }
00:24:27.595 ]
00:24:27.595 }'
00:24:27.595 17:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:27.595 17:18:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:24:28.162 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:28.162 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:24:28.421 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:24:28.421 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:24:28.680 [2024-07-23 17:18:23.908178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:24:28.680 17:18:23 bdev_raid.raid_state_function_test --
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.680 17:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:28.939 17:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:28.939 "name": "Existed_Raid", 00:24:28.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.939 "strip_size_kb": 64, 00:24:28.939 "state": "configuring", 00:24:28.939 "raid_level": "concat", 00:24:28.939 "superblock": false, 00:24:28.939 "num_base_bdevs": 4, 00:24:28.939 "num_base_bdevs_discovered": 3, 00:24:28.939 "num_base_bdevs_operational": 4, 00:24:28.939 "base_bdevs_list": [ 00:24:28.939 { 00:24:28.939 "name": "BaseBdev1", 00:24:28.939 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:28.939 "is_configured": true, 00:24:28.939 "data_offset": 0, 00:24:28.939 "data_size": 65536 00:24:28.939 }, 00:24:28.939 { 00:24:28.939 "name": null, 00:24:28.939 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f", 00:24:28.939 "is_configured": false, 00:24:28.939 "data_offset": 0, 00:24:28.939 "data_size": 65536 00:24:28.940 }, 00:24:28.940 { 00:24:28.940 "name": "BaseBdev3", 00:24:28.940 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c", 
00:24:28.940 "is_configured": true, 00:24:28.940 "data_offset": 0, 00:24:28.940 "data_size": 65536 00:24:28.940 }, 00:24:28.940 { 00:24:28.940 "name": "BaseBdev4", 00:24:28.940 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f", 00:24:28.940 "is_configured": true, 00:24:28.940 "data_offset": 0, 00:24:28.940 "data_size": 65536 00:24:28.940 } 00:24:28.940 ] 00:24:28.940 }' 00:24:28.940 17:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:28.940 17:18:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:29.507 17:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.507 17:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:24:29.766 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:24:29.766 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:30.024 [2024-07-23 17:18:25.267803] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:30.024 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.283 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:30.283 "name": "Existed_Raid", 00:24:30.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.283 "strip_size_kb": 64, 00:24:30.283 "state": "configuring", 00:24:30.283 "raid_level": "concat", 00:24:30.283 "superblock": false, 00:24:30.283 "num_base_bdevs": 4, 00:24:30.283 "num_base_bdevs_discovered": 2, 00:24:30.283 "num_base_bdevs_operational": 4, 00:24:30.283 "base_bdevs_list": [ 00:24:30.283 { 00:24:30.283 "name": null, 00:24:30.283 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:30.283 "is_configured": false, 00:24:30.283 "data_offset": 0, 00:24:30.283 "data_size": 65536 00:24:30.283 }, 00:24:30.283 { 00:24:30.283 "name": null, 00:24:30.283 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f", 00:24:30.283 "is_configured": false, 00:24:30.283 "data_offset": 0, 00:24:30.283 "data_size": 65536 00:24:30.283 }, 00:24:30.283 { 00:24:30.283 "name": "BaseBdev3", 00:24:30.283 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c", 00:24:30.283 "is_configured": true, 00:24:30.283 "data_offset": 0, 00:24:30.283 "data_size": 65536 00:24:30.283 }, 
00:24:30.283 { 00:24:30.283 "name": "BaseBdev4", 00:24:30.283 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f", 00:24:30.283 "is_configured": true, 00:24:30.283 "data_offset": 0, 00:24:30.283 "data_size": 65536 00:24:30.283 } 00:24:30.283 ] 00:24:30.283 }' 00:24:30.283 17:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:30.283 17:18:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:30.851 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.851 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:24:31.110 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:24:31.110 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:24:31.369 [2024-07-23 17:18:26.675946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:31.369 
17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.369 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:31.628 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.628 "name": "Existed_Raid", 00:24:31.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.628 "strip_size_kb": 64, 00:24:31.628 "state": "configuring", 00:24:31.628 "raid_level": "concat", 00:24:31.628 "superblock": false, 00:24:31.628 "num_base_bdevs": 4, 00:24:31.628 "num_base_bdevs_discovered": 3, 00:24:31.628 "num_base_bdevs_operational": 4, 00:24:31.628 "base_bdevs_list": [ 00:24:31.628 { 00:24:31.628 "name": null, 00:24:31.628 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:31.628 "is_configured": false, 00:24:31.628 "data_offset": 0, 00:24:31.628 "data_size": 65536 00:24:31.628 }, 00:24:31.628 { 00:24:31.628 "name": "BaseBdev2", 00:24:31.628 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f", 00:24:31.628 "is_configured": true, 00:24:31.628 "data_offset": 0, 00:24:31.628 "data_size": 65536 00:24:31.628 }, 00:24:31.628 { 00:24:31.628 "name": "BaseBdev3", 00:24:31.628 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c", 00:24:31.628 "is_configured": true, 00:24:31.628 "data_offset": 0, 00:24:31.628 "data_size": 65536 00:24:31.628 }, 00:24:31.628 { 00:24:31.628 "name": "BaseBdev4", 00:24:31.628 "uuid": 
"459a5deb-8d3a-42da-aa6b-b1b6623de67f", 00:24:31.628 "is_configured": true, 00:24:31.628 "data_offset": 0, 00:24:31.628 "data_size": 65536 00:24:31.628 } 00:24:31.628 ] 00:24:31.628 }' 00:24:31.628 17:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.628 17:18:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:32.195 17:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.195 17:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:24:32.454 17:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:24:32.454 17:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.454 17:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:24:32.712 17:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 268e6657-e712-457f-8e00-b760c91c535f 00:24:33.280 [2024-07-23 17:18:28.577397] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:24:33.280 [2024-07-23 17:18:28.577439] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1369390 00:24:33.280 [2024-07-23 17:18:28.577448] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:24:33.280 [2024-07-23 17:18:28.577649] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x136c060 00:24:33.280 [2024-07-23 17:18:28.577766] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1369390 00:24:33.280 [2024-07-23 17:18:28.577776] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1369390 00:24:33.280 [2024-07-23 17:18:28.577948] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.280 NewBaseBdev 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:33.280 17:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:33.848 17:18:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:24:34.416 [ 00:24:34.416 { 00:24:34.416 "name": "NewBaseBdev", 00:24:34.416 "aliases": [ 00:24:34.416 "268e6657-e712-457f-8e00-b760c91c535f" 00:24:34.416 ], 00:24:34.416 "product_name": "Malloc disk", 00:24:34.416 "block_size": 512, 00:24:34.416 "num_blocks": 65536, 00:24:34.416 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:34.416 "assigned_rate_limits": { 00:24:34.416 "rw_ios_per_sec": 0, 00:24:34.416 "rw_mbytes_per_sec": 0, 00:24:34.416 "r_mbytes_per_sec": 0, 00:24:34.416 "w_mbytes_per_sec": 0 00:24:34.416 }, 00:24:34.416 "claimed": true, 00:24:34.416 
"claim_type": "exclusive_write", 00:24:34.416 "zoned": false, 00:24:34.416 "supported_io_types": { 00:24:34.416 "read": true, 00:24:34.416 "write": true, 00:24:34.416 "unmap": true, 00:24:34.416 "flush": true, 00:24:34.416 "reset": true, 00:24:34.416 "nvme_admin": false, 00:24:34.416 "nvme_io": false, 00:24:34.416 "nvme_io_md": false, 00:24:34.416 "write_zeroes": true, 00:24:34.416 "zcopy": true, 00:24:34.416 "get_zone_info": false, 00:24:34.416 "zone_management": false, 00:24:34.416 "zone_append": false, 00:24:34.416 "compare": false, 00:24:34.416 "compare_and_write": false, 00:24:34.416 "abort": true, 00:24:34.416 "seek_hole": false, 00:24:34.416 "seek_data": false, 00:24:34.416 "copy": true, 00:24:34.416 "nvme_iov_md": false 00:24:34.416 }, 00:24:34.416 "memory_domains": [ 00:24:34.416 { 00:24:34.416 "dma_device_id": "system", 00:24:34.416 "dma_device_type": 1 00:24:34.416 }, 00:24:34.416 { 00:24:34.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:34.416 "dma_device_type": 2 00:24:34.416 } 00:24:34.416 ], 00:24:34.416 "driver_specific": {} 00:24:34.416 } 00:24:34.416 ] 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.416 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:34.675 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.675 "name": "Existed_Raid", 00:24:34.675 "uuid": "b2e10cbf-ca02-41d8-a5f3-450b4df3763a", 00:24:34.675 "strip_size_kb": 64, 00:24:34.675 "state": "online", 00:24:34.675 "raid_level": "concat", 00:24:34.675 "superblock": false, 00:24:34.675 "num_base_bdevs": 4, 00:24:34.675 "num_base_bdevs_discovered": 4, 00:24:34.675 "num_base_bdevs_operational": 4, 00:24:34.675 "base_bdevs_list": [ 00:24:34.675 { 00:24:34.675 "name": "NewBaseBdev", 00:24:34.675 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:34.675 "is_configured": true, 00:24:34.675 "data_offset": 0, 00:24:34.675 "data_size": 65536 00:24:34.675 }, 00:24:34.675 { 00:24:34.675 "name": "BaseBdev2", 00:24:34.675 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f", 00:24:34.675 "is_configured": true, 00:24:34.676 "data_offset": 0, 00:24:34.676 "data_size": 65536 00:24:34.676 }, 00:24:34.676 { 00:24:34.676 "name": "BaseBdev3", 00:24:34.676 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c", 00:24:34.676 "is_configured": true, 00:24:34.676 "data_offset": 0, 00:24:34.676 "data_size": 65536 00:24:34.676 }, 00:24:34.676 { 00:24:34.676 "name": "BaseBdev4", 00:24:34.676 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f", 00:24:34.676 "is_configured": 
true, 00:24:34.676 "data_offset": 0, 00:24:34.676 "data_size": 65536 00:24:34.676 } 00:24:34.676 ] 00:24:34.676 }' 00:24:34.676 17:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.676 17:18:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:35.244 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:35.535 [2024-07-23 17:18:30.731430] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:35.535 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:35.535 "name": "Existed_Raid", 00:24:35.535 "aliases": [ 00:24:35.535 "b2e10cbf-ca02-41d8-a5f3-450b4df3763a" 00:24:35.535 ], 00:24:35.535 "product_name": "Raid Volume", 00:24:35.535 "block_size": 512, 00:24:35.535 "num_blocks": 262144, 00:24:35.535 "uuid": "b2e10cbf-ca02-41d8-a5f3-450b4df3763a", 00:24:35.535 "assigned_rate_limits": { 00:24:35.535 "rw_ios_per_sec": 0, 00:24:35.535 "rw_mbytes_per_sec": 0, 00:24:35.535 "r_mbytes_per_sec": 0, 00:24:35.535 
"w_mbytes_per_sec": 0 00:24:35.535 }, 00:24:35.535 "claimed": false, 00:24:35.535 "zoned": false, 00:24:35.535 "supported_io_types": { 00:24:35.535 "read": true, 00:24:35.535 "write": true, 00:24:35.535 "unmap": true, 00:24:35.535 "flush": true, 00:24:35.535 "reset": true, 00:24:35.535 "nvme_admin": false, 00:24:35.535 "nvme_io": false, 00:24:35.535 "nvme_io_md": false, 00:24:35.535 "write_zeroes": true, 00:24:35.535 "zcopy": false, 00:24:35.535 "get_zone_info": false, 00:24:35.535 "zone_management": false, 00:24:35.535 "zone_append": false, 00:24:35.535 "compare": false, 00:24:35.536 "compare_and_write": false, 00:24:35.536 "abort": false, 00:24:35.536 "seek_hole": false, 00:24:35.536 "seek_data": false, 00:24:35.536 "copy": false, 00:24:35.536 "nvme_iov_md": false 00:24:35.536 }, 00:24:35.536 "memory_domains": [ 00:24:35.536 { 00:24:35.536 "dma_device_id": "system", 00:24:35.536 "dma_device_type": 1 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:35.536 "dma_device_type": 2 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "system", 00:24:35.536 "dma_device_type": 1 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:35.536 "dma_device_type": 2 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "system", 00:24:35.536 "dma_device_type": 1 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:35.536 "dma_device_type": 2 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "system", 00:24:35.536 "dma_device_type": 1 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:35.536 "dma_device_type": 2 00:24:35.536 } 00:24:35.536 ], 00:24:35.536 "driver_specific": { 00:24:35.536 "raid": { 00:24:35.536 "uuid": "b2e10cbf-ca02-41d8-a5f3-450b4df3763a", 00:24:35.536 "strip_size_kb": 64, 00:24:35.536 "state": "online", 00:24:35.536 "raid_level": "concat", 00:24:35.536 "superblock": false, 
00:24:35.536 "num_base_bdevs": 4, 00:24:35.536 "num_base_bdevs_discovered": 4, 00:24:35.536 "num_base_bdevs_operational": 4, 00:24:35.536 "base_bdevs_list": [ 00:24:35.536 { 00:24:35.536 "name": "NewBaseBdev", 00:24:35.536 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:35.536 "is_configured": true, 00:24:35.536 "data_offset": 0, 00:24:35.536 "data_size": 65536 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "name": "BaseBdev2", 00:24:35.536 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f", 00:24:35.536 "is_configured": true, 00:24:35.536 "data_offset": 0, 00:24:35.536 "data_size": 65536 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "name": "BaseBdev3", 00:24:35.536 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c", 00:24:35.536 "is_configured": true, 00:24:35.536 "data_offset": 0, 00:24:35.536 "data_size": 65536 00:24:35.536 }, 00:24:35.536 { 00:24:35.536 "name": "BaseBdev4", 00:24:35.536 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f", 00:24:35.536 "is_configured": true, 00:24:35.536 "data_offset": 0, 00:24:35.536 "data_size": 65536 00:24:35.536 } 00:24:35.536 ] 00:24:35.536 } 00:24:35.536 } 00:24:35.536 }' 00:24:35.536 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:35.536 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:24:35.536 BaseBdev2 00:24:35.536 BaseBdev3 00:24:35.536 BaseBdev4' 00:24:35.536 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:35.536 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:24:35.536 17:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:36.104 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:24:36.104 "name": "NewBaseBdev", 00:24:36.104 "aliases": [ 00:24:36.104 "268e6657-e712-457f-8e00-b760c91c535f" 00:24:36.104 ], 00:24:36.104 "product_name": "Malloc disk", 00:24:36.104 "block_size": 512, 00:24:36.104 "num_blocks": 65536, 00:24:36.104 "uuid": "268e6657-e712-457f-8e00-b760c91c535f", 00:24:36.104 "assigned_rate_limits": { 00:24:36.104 "rw_ios_per_sec": 0, 00:24:36.104 "rw_mbytes_per_sec": 0, 00:24:36.104 "r_mbytes_per_sec": 0, 00:24:36.104 "w_mbytes_per_sec": 0 00:24:36.104 }, 00:24:36.104 "claimed": true, 00:24:36.104 "claim_type": "exclusive_write", 00:24:36.104 "zoned": false, 00:24:36.104 "supported_io_types": { 00:24:36.104 "read": true, 00:24:36.104 "write": true, 00:24:36.104 "unmap": true, 00:24:36.104 "flush": true, 00:24:36.104 "reset": true, 00:24:36.104 "nvme_admin": false, 00:24:36.104 "nvme_io": false, 00:24:36.104 "nvme_io_md": false, 00:24:36.104 "write_zeroes": true, 00:24:36.104 "zcopy": true, 00:24:36.104 "get_zone_info": false, 00:24:36.104 "zone_management": false, 00:24:36.104 "zone_append": false, 00:24:36.104 "compare": false, 00:24:36.104 "compare_and_write": false, 00:24:36.104 "abort": true, 00:24:36.104 "seek_hole": false, 00:24:36.104 "seek_data": false, 00:24:36.105 "copy": true, 00:24:36.105 "nvme_iov_md": false 00:24:36.105 }, 00:24:36.105 "memory_domains": [ 00:24:36.105 { 00:24:36.105 "dma_device_id": "system", 00:24:36.105 "dma_device_type": 1 00:24:36.105 }, 00:24:36.105 { 00:24:36.105 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:36.105 "dma_device_type": 2 00:24:36.105 } 00:24:36.105 ], 00:24:36.105 "driver_specific": {} 00:24:36.105 }' 00:24:36.105 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:36.105 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:36.105 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:36.105 17:18:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:36.105 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:36.105 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:36.105 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:36.364 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:36.622 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:36.622 "name": "BaseBdev2", 00:24:36.622 "aliases": [ 00:24:36.622 "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f" 00:24:36.622 ], 00:24:36.622 "product_name": "Malloc disk", 00:24:36.622 "block_size": 512, 00:24:36.622 "num_blocks": 65536, 00:24:36.622 "uuid": "bdf9dbe4-f4d7-4b17-b43e-e16ead226d5f", 00:24:36.622 "assigned_rate_limits": { 00:24:36.622 "rw_ios_per_sec": 0, 00:24:36.622 "rw_mbytes_per_sec": 0, 00:24:36.622 "r_mbytes_per_sec": 0, 00:24:36.622 "w_mbytes_per_sec": 0 00:24:36.622 }, 00:24:36.622 "claimed": true, 00:24:36.622 "claim_type": "exclusive_write", 
00:24:36.622 "zoned": false, 00:24:36.622 "supported_io_types": { 00:24:36.622 "read": true, 00:24:36.622 "write": true, 00:24:36.622 "unmap": true, 00:24:36.622 "flush": true, 00:24:36.622 "reset": true, 00:24:36.622 "nvme_admin": false, 00:24:36.622 "nvme_io": false, 00:24:36.622 "nvme_io_md": false, 00:24:36.622 "write_zeroes": true, 00:24:36.623 "zcopy": true, 00:24:36.623 "get_zone_info": false, 00:24:36.623 "zone_management": false, 00:24:36.623 "zone_append": false, 00:24:36.623 "compare": false, 00:24:36.623 "compare_and_write": false, 00:24:36.623 "abort": true, 00:24:36.623 "seek_hole": false, 00:24:36.623 "seek_data": false, 00:24:36.623 "copy": true, 00:24:36.623 "nvme_iov_md": false 00:24:36.623 }, 00:24:36.623 "memory_domains": [ 00:24:36.623 { 00:24:36.623 "dma_device_id": "system", 00:24:36.623 "dma_device_type": 1 00:24:36.623 }, 00:24:36.623 { 00:24:36.623 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:36.623 "dma_device_type": 2 00:24:36.623 } 00:24:36.623 ], 00:24:36.623 "driver_specific": {} 00:24:36.623 }' 00:24:36.623 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:36.623 17:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:36.623 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:36.623 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:36.881 17:18:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:36.881 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:37.140 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:37.140 "name": "BaseBdev3", 00:24:37.140 "aliases": [ 00:24:37.140 "0ecd830a-0f6a-4064-864d-bdee2129821c" 00:24:37.140 ], 00:24:37.140 "product_name": "Malloc disk", 00:24:37.140 "block_size": 512, 00:24:37.140 "num_blocks": 65536, 00:24:37.140 "uuid": "0ecd830a-0f6a-4064-864d-bdee2129821c", 00:24:37.140 "assigned_rate_limits": { 00:24:37.140 "rw_ios_per_sec": 0, 00:24:37.140 "rw_mbytes_per_sec": 0, 00:24:37.140 "r_mbytes_per_sec": 0, 00:24:37.140 "w_mbytes_per_sec": 0 00:24:37.140 }, 00:24:37.140 "claimed": true, 00:24:37.140 "claim_type": "exclusive_write", 00:24:37.140 "zoned": false, 00:24:37.140 "supported_io_types": { 00:24:37.140 "read": true, 00:24:37.140 "write": true, 00:24:37.140 "unmap": true, 00:24:37.140 "flush": true, 00:24:37.140 "reset": true, 00:24:37.140 "nvme_admin": false, 00:24:37.140 "nvme_io": false, 00:24:37.140 "nvme_io_md": false, 00:24:37.140 "write_zeroes": true, 00:24:37.140 "zcopy": true, 00:24:37.140 "get_zone_info": false, 00:24:37.140 "zone_management": false, 00:24:37.140 "zone_append": false, 00:24:37.140 "compare": false, 00:24:37.140 "compare_and_write": false, 00:24:37.140 "abort": true, 00:24:37.140 "seek_hole": false, 
00:24:37.140 "seek_data": false, 00:24:37.140 "copy": true, 00:24:37.140 "nvme_iov_md": false 00:24:37.140 }, 00:24:37.140 "memory_domains": [ 00:24:37.140 { 00:24:37.140 "dma_device_id": "system", 00:24:37.140 "dma_device_type": 1 00:24:37.140 }, 00:24:37.140 { 00:24:37.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:37.140 "dma_device_type": 2 00:24:37.140 } 00:24:37.140 ], 00:24:37.140 "driver_specific": {} 00:24:37.140 }' 00:24:37.140 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:37.399 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:37.658 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:37.658 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:37.658 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:37.658 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 
00:24:37.658 17:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:37.917 "name": "BaseBdev4", 00:24:37.917 "aliases": [ 00:24:37.917 "459a5deb-8d3a-42da-aa6b-b1b6623de67f" 00:24:37.917 ], 00:24:37.917 "product_name": "Malloc disk", 00:24:37.917 "block_size": 512, 00:24:37.917 "num_blocks": 65536, 00:24:37.917 "uuid": "459a5deb-8d3a-42da-aa6b-b1b6623de67f", 00:24:37.917 "assigned_rate_limits": { 00:24:37.917 "rw_ios_per_sec": 0, 00:24:37.917 "rw_mbytes_per_sec": 0, 00:24:37.917 "r_mbytes_per_sec": 0, 00:24:37.917 "w_mbytes_per_sec": 0 00:24:37.917 }, 00:24:37.917 "claimed": true, 00:24:37.917 "claim_type": "exclusive_write", 00:24:37.917 "zoned": false, 00:24:37.917 "supported_io_types": { 00:24:37.917 "read": true, 00:24:37.917 "write": true, 00:24:37.917 "unmap": true, 00:24:37.917 "flush": true, 00:24:37.917 "reset": true, 00:24:37.917 "nvme_admin": false, 00:24:37.917 "nvme_io": false, 00:24:37.917 "nvme_io_md": false, 00:24:37.917 "write_zeroes": true, 00:24:37.917 "zcopy": true, 00:24:37.917 "get_zone_info": false, 00:24:37.917 "zone_management": false, 00:24:37.917 "zone_append": false, 00:24:37.917 "compare": false, 00:24:37.917 "compare_and_write": false, 00:24:37.917 "abort": true, 00:24:37.917 "seek_hole": false, 00:24:37.917 "seek_data": false, 00:24:37.917 "copy": true, 00:24:37.917 "nvme_iov_md": false 00:24:37.917 }, 00:24:37.917 "memory_domains": [ 00:24:37.917 { 00:24:37.917 "dma_device_id": "system", 00:24:37.917 "dma_device_type": 1 00:24:37.917 }, 00:24:37.917 { 00:24:37.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:37.917 "dma_device_type": 2 00:24:37.917 } 00:24:37.917 ], 00:24:37.917 "driver_specific": {} 00:24:37.917 }' 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:37.917 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:38.175 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:38.175 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:38.175 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:38.175 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:38.175 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:38.175 17:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:38.434 [2024-07-23 17:18:33.727092] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:38.434 [2024-07-23 17:18:33.727122] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:38.434 [2024-07-23 17:18:33.727172] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:38.434 [2024-07-23 17:18:33.727235] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:38.434 [2024-07-23 17:18:33.727247] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1369390 name Existed_Raid, state offline 00:24:38.434 17:18:33 
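The repeated field checks above (bdev_raid.sh@205-208) pipe `bdev_get_bdevs` output through `jq` and compare each field against an expected value. A minimal self-contained sketch of that pattern, using a canned JSON fragment in place of the live `rpc.py` call and a naive `sed` extractor instead of `jq` (both are stand-in assumptions, not the actual test code):

```shell
# Canned bdev_get_bdevs output; the real test fetches this over the RPC socket.
base_bdev_info='{ "name": "BaseBdev2", "block_size": 512, "num_blocks": 65536, "md_size": null }'

# Naive single-level field extractor standing in for `jq .<field>`
# (assumption: flat JSON, one occurrence of each key).
get_field() {
  echo "$base_bdev_info" | sed -n "s/.*\"$1\": \([^,}]*\).*/\1/p" | tr -d ' "'
}

# The same comparisons the test performs at bdev_raid.sh@205-208.
[[ $(get_field block_size) == 512 ]] && echo "block_size OK"
[[ $(get_field md_size) == null ]] && echo "md_size OK"
```

The real script keeps `jq` so it can handle nested structures such as `memory_domains`; the flat extractor here only works because the sample blob has no nesting.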
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 4187060 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 4187060 ']' 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 4187060 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4187060 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4187060' 00:24:38.434 killing process with pid 4187060 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 4187060 00:24:38.434 [2024-07-23 17:18:33.799531] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:38.434 17:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 4187060 00:24:38.434 [2024-07-23 17:18:33.841670] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:38.694 17:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:24:38.694 00:24:38.694 real 0m37.484s 00:24:38.694 user 1m8.805s 00:24:38.694 sys 0m6.571s 00:24:38.694 17:18:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:38.694 17:18:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:24:38.694 ************************************ 00:24:38.694 END TEST raid_state_function_test 
00:24:38.694 ************************************ 00:24:38.694 17:18:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:38.694 17:18:34 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:24:38.694 17:18:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:24:38.694 17:18:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:38.694 17:18:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:38.953 ************************************ 00:24:38.953 START TEST raid_state_function_test_sb 00:24:38.953 ************************************ 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:24:38.953 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=4193087 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 4193087' 00:24:38.954 Process raid pid: 4193087 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 4193087 /var/tmp/spdk-raid.sock 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 4193087 ']' 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:38.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:38.954 17:18:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:38.954 [2024-07-23 17:18:34.220534] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:24:38.954 [2024-07-23 17:18:34.220612] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:38.954 [2024-07-23 17:18:34.356782] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:39.213 [2024-07-23 17:18:34.410055] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:39.213 [2024-07-23 17:18:34.471543] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:39.213 [2024-07-23 17:18:34.471572] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:39.781 17:18:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:39.781 17:18:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:39.781 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:40.040 [2024-07-23 17:18:35.315636] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:40.040 [2024-07-23 17:18:35.315673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:40.040 [2024-07-23 17:18:35.315685] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:40.040 [2024-07-23 17:18:35.315697] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:40.040 [2024-07-23 17:18:35.315705] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:40.040 [2024-07-23 17:18:35.315716] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:24:40.040 [2024-07-23 17:18:35.315725] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:24:40.040 [2024-07-23 17:18:35.315735] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.040 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:40.299 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:40.299 "name": "Existed_Raid", 00:24:40.299 "uuid": 
"fb1f5d87-690f-491d-af50-c0c56eece4e1", 00:24:40.299 "strip_size_kb": 64, 00:24:40.299 "state": "configuring", 00:24:40.299 "raid_level": "concat", 00:24:40.299 "superblock": true, 00:24:40.299 "num_base_bdevs": 4, 00:24:40.299 "num_base_bdevs_discovered": 0, 00:24:40.299 "num_base_bdevs_operational": 4, 00:24:40.299 "base_bdevs_list": [ 00:24:40.299 { 00:24:40.299 "name": "BaseBdev1", 00:24:40.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:40.299 "is_configured": false, 00:24:40.299 "data_offset": 0, 00:24:40.299 "data_size": 0 00:24:40.299 }, 00:24:40.299 { 00:24:40.299 "name": "BaseBdev2", 00:24:40.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:40.299 "is_configured": false, 00:24:40.299 "data_offset": 0, 00:24:40.299 "data_size": 0 00:24:40.299 }, 00:24:40.299 { 00:24:40.299 "name": "BaseBdev3", 00:24:40.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:40.299 "is_configured": false, 00:24:40.299 "data_offset": 0, 00:24:40.299 "data_size": 0 00:24:40.299 }, 00:24:40.299 { 00:24:40.299 "name": "BaseBdev4", 00:24:40.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:40.299 "is_configured": false, 00:24:40.299 "data_offset": 0, 00:24:40.299 "data_size": 0 00:24:40.299 } 00:24:40.299 ] 00:24:40.299 }' 00:24:40.299 17:18:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:40.299 17:18:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:40.866 17:18:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:41.124 [2024-07-23 17:18:36.410370] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:41.124 [2024-07-23 17:18:36.410401] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbdf430 name Existed_Raid, state configuring 00:24:41.124 17:18:36 
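`verify_raid_bdev_state` (bdev_raid.sh@116-128) selects the named raid bdev from `bdev_raid_get_bdevs all` and compares its `state`, level, and base-bdev counts against the expected values. A rough sketch of those comparisons against a canned info blob (the JSON literal and the `sed`-based extractor are stand-ins for the live RPC call plus `jq`):

```shell
# Canned `bdev_raid_get_bdevs` entry for Existed_Raid (values from the log above).
raid_bdev_info='{ "name": "Existed_Raid", "state": "configuring", "raid_level": "concat", "num_base_bdevs": 4, "num_base_bdevs_discovered": 0 }'

# Assumption: flat JSON with one occurrence per key.
field() {
  echo "$raid_bdev_info" | sed -n "s/.*\"$1\": \([^,}]*\).*/\1/p" | tr -d ' "'
}

# The state checks verify_raid_bdev_state performs after each create/delete step.
[[ $(field state) == configuring ]] && echo "state OK"
[[ $(field raid_level) == concat ]] && echo "raid_level OK"
[[ $(field num_base_bdevs_discovered) == 0 ]] && echo "discovered OK"
```

In the log, `num_base_bdevs_discovered` climbs from 0 toward `num_base_bdevs` as each BaseBdev is created and claimed, which is exactly what this comparison tracks.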
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:41.383 [2024-07-23 17:18:36.659069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:41.383 [2024-07-23 17:18:36.659096] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:41.383 [2024-07-23 17:18:36.659105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:41.383 [2024-07-23 17:18:36.659116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:41.383 [2024-07-23 17:18:36.659125] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:41.383 [2024-07-23 17:18:36.659136] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:41.383 [2024-07-23 17:18:36.659144] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:24:41.383 [2024-07-23 17:18:36.659155] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:24:41.383 17:18:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:24:41.641 [2024-07-23 17:18:36.913387] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:41.641 BaseBdev1 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:41.641 17:18:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:41.900 17:18:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:42.158 [ 00:24:42.158 { 00:24:42.158 "name": "BaseBdev1", 00:24:42.158 "aliases": [ 00:24:42.158 "ba37e70d-362e-4c04-aa10-37fcd9d427d5" 00:24:42.158 ], 00:24:42.158 "product_name": "Malloc disk", 00:24:42.158 "block_size": 512, 00:24:42.158 "num_blocks": 65536, 00:24:42.158 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:42.158 "assigned_rate_limits": { 00:24:42.158 "rw_ios_per_sec": 0, 00:24:42.158 "rw_mbytes_per_sec": 0, 00:24:42.158 "r_mbytes_per_sec": 0, 00:24:42.158 "w_mbytes_per_sec": 0 00:24:42.158 }, 00:24:42.158 "claimed": true, 00:24:42.158 "claim_type": "exclusive_write", 00:24:42.158 "zoned": false, 00:24:42.158 "supported_io_types": { 00:24:42.158 "read": true, 00:24:42.158 "write": true, 00:24:42.158 "unmap": true, 00:24:42.158 "flush": true, 00:24:42.158 "reset": true, 00:24:42.158 "nvme_admin": false, 00:24:42.158 "nvme_io": false, 00:24:42.158 "nvme_io_md": false, 00:24:42.158 "write_zeroes": true, 00:24:42.158 "zcopy": true, 00:24:42.158 "get_zone_info": false, 00:24:42.158 "zone_management": false, 00:24:42.158 "zone_append": false, 00:24:42.158 "compare": false, 00:24:42.158 "compare_and_write": false, 00:24:42.158 "abort": true, 00:24:42.158 "seek_hole": 
false, 00:24:42.158 "seek_data": false, 00:24:42.158 "copy": true, 00:24:42.158 "nvme_iov_md": false 00:24:42.158 }, 00:24:42.158 "memory_domains": [ 00:24:42.158 { 00:24:42.158 "dma_device_id": "system", 00:24:42.158 "dma_device_type": 1 00:24:42.158 }, 00:24:42.158 { 00:24:42.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.158 "dma_device_type": 2 00:24:42.158 } 00:24:42.158 ], 00:24:42.158 "driver_specific": {} 00:24:42.158 } 00:24:42.158 ] 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:42.158 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.159 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.159 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.159 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.159 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.159 17:18:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:42.417 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.417 "name": "Existed_Raid", 00:24:42.417 "uuid": "4be6e6ba-6393-42a2-becf-7ec6b9775aa3", 00:24:42.417 "strip_size_kb": 64, 00:24:42.417 "state": "configuring", 00:24:42.417 "raid_level": "concat", 00:24:42.417 "superblock": true, 00:24:42.417 "num_base_bdevs": 4, 00:24:42.417 "num_base_bdevs_discovered": 1, 00:24:42.417 "num_base_bdevs_operational": 4, 00:24:42.417 "base_bdevs_list": [ 00:24:42.417 { 00:24:42.417 "name": "BaseBdev1", 00:24:42.417 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:42.417 "is_configured": true, 00:24:42.417 "data_offset": 2048, 00:24:42.417 "data_size": 63488 00:24:42.417 }, 00:24:42.417 { 00:24:42.417 "name": "BaseBdev2", 00:24:42.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.417 "is_configured": false, 00:24:42.417 "data_offset": 0, 00:24:42.417 "data_size": 0 00:24:42.417 }, 00:24:42.417 { 00:24:42.417 "name": "BaseBdev3", 00:24:42.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.417 "is_configured": false, 00:24:42.417 "data_offset": 0, 00:24:42.417 "data_size": 0 00:24:42.417 }, 00:24:42.417 { 00:24:42.417 "name": "BaseBdev4", 00:24:42.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.417 "is_configured": false, 00:24:42.417 "data_offset": 0, 00:24:42.417 "data_size": 0 00:24:42.417 } 00:24:42.417 ] 00:24:42.417 }' 00:24:42.417 17:18:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.417 17:18:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:42.984 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:43.242 [2024-07-23 
17:18:38.441427] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:43.242 [2024-07-23 17:18:38.441469] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbded60 name Existed_Raid, state configuring 00:24:43.242 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:43.501 [2024-07-23 17:18:38.694139] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:43.501 [2024-07-23 17:18:38.695538] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:43.501 [2024-07-23 17:18:38.695571] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:43.501 [2024-07-23 17:18:38.695581] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:24:43.501 [2024-07-23 17:18:38.695593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:24:43.501 [2024-07-23 17:18:38.695601] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:24:43.501 [2024-07-23 17:18:38.695612] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:43.501 "name": "Existed_Raid", 00:24:43.501 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:43.501 "strip_size_kb": 64, 00:24:43.501 "state": "configuring", 00:24:43.501 "raid_level": "concat", 00:24:43.501 "superblock": true, 00:24:43.501 "num_base_bdevs": 4, 00:24:43.501 "num_base_bdevs_discovered": 1, 00:24:43.501 "num_base_bdevs_operational": 4, 00:24:43.501 "base_bdevs_list": [ 00:24:43.501 { 00:24:43.501 "name": "BaseBdev1", 00:24:43.501 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:43.501 "is_configured": true, 00:24:43.501 "data_offset": 2048, 00:24:43.501 "data_size": 63488 00:24:43.501 }, 00:24:43.501 { 00:24:43.501 "name": "BaseBdev2", 00:24:43.501 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:43.501 "is_configured": false, 00:24:43.501 "data_offset": 0, 00:24:43.501 "data_size": 0 00:24:43.501 }, 00:24:43.501 { 00:24:43.501 "name": "BaseBdev3", 00:24:43.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.501 "is_configured": false, 00:24:43.501 "data_offset": 0, 00:24:43.501 "data_size": 0 00:24:43.501 }, 00:24:43.501 { 00:24:43.501 "name": "BaseBdev4", 00:24:43.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.501 "is_configured": false, 00:24:43.501 "data_offset": 0, 00:24:43.501 "data_size": 0 00:24:43.501 } 00:24:43.501 ] 00:24:43.501 }' 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:43.501 17:18:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:44.436 [2024-07-23 17:18:39.704169] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:44.436 BaseBdev2 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:44.436 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:44.696 17:18:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:44.696 [ 00:24:44.696 { 00:24:44.696 "name": "BaseBdev2", 00:24:44.696 "aliases": [ 00:24:44.696 "4997a348-fdcb-423b-a612-604357ac7340" 00:24:44.696 ], 00:24:44.696 "product_name": "Malloc disk", 00:24:44.696 "block_size": 512, 00:24:44.696 "num_blocks": 65536, 00:24:44.696 "uuid": "4997a348-fdcb-423b-a612-604357ac7340", 00:24:44.696 "assigned_rate_limits": { 00:24:44.696 "rw_ios_per_sec": 0, 00:24:44.696 "rw_mbytes_per_sec": 0, 00:24:44.696 "r_mbytes_per_sec": 0, 00:24:44.696 "w_mbytes_per_sec": 0 00:24:44.696 }, 00:24:44.696 "claimed": true, 00:24:44.696 "claim_type": "exclusive_write", 00:24:44.696 "zoned": false, 00:24:44.696 "supported_io_types": { 00:24:44.696 "read": true, 00:24:44.696 "write": true, 00:24:44.696 "unmap": true, 00:24:44.696 "flush": true, 00:24:44.696 "reset": true, 00:24:44.696 "nvme_admin": false, 00:24:44.696 "nvme_io": false, 00:24:44.696 "nvme_io_md": false, 00:24:44.696 "write_zeroes": true, 00:24:44.696 "zcopy": true, 00:24:44.696 "get_zone_info": false, 00:24:44.696 "zone_management": false, 00:24:44.696 "zone_append": false, 00:24:44.696 "compare": false, 00:24:44.696 "compare_and_write": false, 00:24:44.696 "abort": true, 00:24:44.696 "seek_hole": false, 00:24:44.696 "seek_data": false, 00:24:44.696 "copy": true, 00:24:44.696 "nvme_iov_md": false 00:24:44.696 }, 00:24:44.696 "memory_domains": [ 00:24:44.696 { 00:24:44.696 "dma_device_id": "system", 00:24:44.696 "dma_device_type": 1 00:24:44.696 }, 00:24:44.696 { 00:24:44.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:44.696 "dma_device_type": 2 00:24:44.696 } 00:24:44.696 ], 00:24:44.696 "driver_specific": {} 00:24:44.696 } 00:24:44.696 ] 
00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.696 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.955 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.955 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:44.955 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.955 "name": "Existed_Raid", 
00:24:44.955 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:44.955 "strip_size_kb": 64, 00:24:44.955 "state": "configuring", 00:24:44.955 "raid_level": "concat", 00:24:44.955 "superblock": true, 00:24:44.955 "num_base_bdevs": 4, 00:24:44.955 "num_base_bdevs_discovered": 2, 00:24:44.955 "num_base_bdevs_operational": 4, 00:24:44.955 "base_bdevs_list": [ 00:24:44.955 { 00:24:44.955 "name": "BaseBdev1", 00:24:44.955 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:44.955 "is_configured": true, 00:24:44.955 "data_offset": 2048, 00:24:44.955 "data_size": 63488 00:24:44.955 }, 00:24:44.955 { 00:24:44.955 "name": "BaseBdev2", 00:24:44.955 "uuid": "4997a348-fdcb-423b-a612-604357ac7340", 00:24:44.955 "is_configured": true, 00:24:44.955 "data_offset": 2048, 00:24:44.955 "data_size": 63488 00:24:44.955 }, 00:24:44.955 { 00:24:44.955 "name": "BaseBdev3", 00:24:44.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.955 "is_configured": false, 00:24:44.955 "data_offset": 0, 00:24:44.955 "data_size": 0 00:24:44.955 }, 00:24:44.955 { 00:24:44.955 "name": "BaseBdev4", 00:24:44.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.955 "is_configured": false, 00:24:44.955 "data_offset": 0, 00:24:44.955 "data_size": 0 00:24:44.955 } 00:24:44.955 ] 00:24:44.955 }' 00:24:44.955 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.955 17:18:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:45.890 17:18:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:45.890 [2024-07-23 17:18:41.211572] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:45.890 BaseBdev3 00:24:45.890 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:24:45.890 
17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:24:45.890 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:45.890 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:45.890 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:45.890 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:45.890 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:46.149 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:24:46.149 [ 00:24:46.149 { 00:24:46.149 "name": "BaseBdev3", 00:24:46.149 "aliases": [ 00:24:46.149 "2763dea6-53ee-46db-b4da-1998d07714cc" 00:24:46.149 ], 00:24:46.149 "product_name": "Malloc disk", 00:24:46.149 "block_size": 512, 00:24:46.149 "num_blocks": 65536, 00:24:46.149 "uuid": "2763dea6-53ee-46db-b4da-1998d07714cc", 00:24:46.149 "assigned_rate_limits": { 00:24:46.149 "rw_ios_per_sec": 0, 00:24:46.149 "rw_mbytes_per_sec": 0, 00:24:46.149 "r_mbytes_per_sec": 0, 00:24:46.149 "w_mbytes_per_sec": 0 00:24:46.149 }, 00:24:46.149 "claimed": true, 00:24:46.149 "claim_type": "exclusive_write", 00:24:46.149 "zoned": false, 00:24:46.149 "supported_io_types": { 00:24:46.149 "read": true, 00:24:46.149 "write": true, 00:24:46.149 "unmap": true, 00:24:46.149 "flush": true, 00:24:46.149 "reset": true, 00:24:46.149 "nvme_admin": false, 00:24:46.149 "nvme_io": false, 00:24:46.149 "nvme_io_md": false, 00:24:46.149 "write_zeroes": true, 00:24:46.149 "zcopy": true, 00:24:46.149 "get_zone_info": 
false, 00:24:46.149 "zone_management": false, 00:24:46.149 "zone_append": false, 00:24:46.149 "compare": false, 00:24:46.149 "compare_and_write": false, 00:24:46.149 "abort": true, 00:24:46.149 "seek_hole": false, 00:24:46.149 "seek_data": false, 00:24:46.149 "copy": true, 00:24:46.149 "nvme_iov_md": false 00:24:46.149 }, 00:24:46.149 "memory_domains": [ 00:24:46.149 { 00:24:46.149 "dma_device_id": "system", 00:24:46.149 "dma_device_type": 1 00:24:46.149 }, 00:24:46.149 { 00:24:46.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:46.149 "dma_device_type": 2 00:24:46.149 } 00:24:46.149 ], 00:24:46.149 "driver_specific": {} 00:24:46.149 } 00:24:46.149 ] 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.408 17:18:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.408 "name": "Existed_Raid", 00:24:46.408 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:46.408 "strip_size_kb": 64, 00:24:46.408 "state": "configuring", 00:24:46.408 "raid_level": "concat", 00:24:46.408 "superblock": true, 00:24:46.408 "num_base_bdevs": 4, 00:24:46.408 "num_base_bdevs_discovered": 3, 00:24:46.408 "num_base_bdevs_operational": 4, 00:24:46.408 "base_bdevs_list": [ 00:24:46.408 { 00:24:46.408 "name": "BaseBdev1", 00:24:46.408 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:46.408 "is_configured": true, 00:24:46.408 "data_offset": 2048, 00:24:46.408 "data_size": 63488 00:24:46.408 }, 00:24:46.408 { 00:24:46.408 "name": "BaseBdev2", 00:24:46.408 "uuid": "4997a348-fdcb-423b-a612-604357ac7340", 00:24:46.408 "is_configured": true, 00:24:46.408 "data_offset": 2048, 00:24:46.408 "data_size": 63488 00:24:46.408 }, 00:24:46.408 { 00:24:46.408 "name": "BaseBdev3", 00:24:46.408 "uuid": "2763dea6-53ee-46db-b4da-1998d07714cc", 00:24:46.408 "is_configured": true, 00:24:46.408 "data_offset": 2048, 00:24:46.408 "data_size": 63488 00:24:46.408 }, 00:24:46.408 { 00:24:46.408 "name": "BaseBdev4", 00:24:46.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.408 "is_configured": false, 00:24:46.408 "data_offset": 0, 00:24:46.408 "data_size": 0 00:24:46.408 } 00:24:46.408 ] 00:24:46.408 }' 00:24:46.408 
17:18:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.408 17:18:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:46.975 17:18:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:24:47.233 [2024-07-23 17:18:42.602701] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:47.233 [2024-07-23 17:18:42.602873] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xbde9b0 00:24:47.234 [2024-07-23 17:18:42.602888] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:24:47.234 [2024-07-23 17:18:42.603074] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc8e990 00:24:47.234 [2024-07-23 17:18:42.603191] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbde9b0 00:24:47.234 [2024-07-23 17:18:42.603201] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbde9b0 00:24:47.234 [2024-07-23 17:18:42.603294] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:47.234 BaseBdev4 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:24:47.234 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:47.492 17:18:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:24:47.750 [ 00:24:47.750 { 00:24:47.750 "name": "BaseBdev4", 00:24:47.750 "aliases": [ 00:24:47.750 "204309d5-1af3-4ce5-8b6e-7faab6ed4a57" 00:24:47.750 ], 00:24:47.750 "product_name": "Malloc disk", 00:24:47.750 "block_size": 512, 00:24:47.750 "num_blocks": 65536, 00:24:47.750 "uuid": "204309d5-1af3-4ce5-8b6e-7faab6ed4a57", 00:24:47.750 "assigned_rate_limits": { 00:24:47.750 "rw_ios_per_sec": 0, 00:24:47.750 "rw_mbytes_per_sec": 0, 00:24:47.750 "r_mbytes_per_sec": 0, 00:24:47.750 "w_mbytes_per_sec": 0 00:24:47.750 }, 00:24:47.750 "claimed": true, 00:24:47.751 "claim_type": "exclusive_write", 00:24:47.751 "zoned": false, 00:24:47.751 "supported_io_types": { 00:24:47.751 "read": true, 00:24:47.751 "write": true, 00:24:47.751 "unmap": true, 00:24:47.751 "flush": true, 00:24:47.751 "reset": true, 00:24:47.751 "nvme_admin": false, 00:24:47.751 "nvme_io": false, 00:24:47.751 "nvme_io_md": false, 00:24:47.751 "write_zeroes": true, 00:24:47.751 "zcopy": true, 00:24:47.751 "get_zone_info": false, 00:24:47.751 "zone_management": false, 00:24:47.751 "zone_append": false, 00:24:47.751 "compare": false, 00:24:47.751 "compare_and_write": false, 00:24:47.751 "abort": true, 00:24:47.751 "seek_hole": false, 00:24:47.751 "seek_data": false, 00:24:47.751 "copy": true, 00:24:47.751 "nvme_iov_md": false 00:24:47.751 }, 00:24:47.751 "memory_domains": [ 00:24:47.751 { 00:24:47.751 "dma_device_id": "system", 00:24:47.751 "dma_device_type": 1 00:24:47.751 }, 00:24:47.751 { 00:24:47.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:47.751 
"dma_device_type": 2 00:24:47.751 } 00:24:47.751 ], 00:24:47.751 "driver_specific": {} 00:24:47.751 } 00:24:47.751 ] 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.751 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:48.010 17:18:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.010 "name": "Existed_Raid", 00:24:48.010 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:48.010 "strip_size_kb": 64, 00:24:48.010 "state": "online", 00:24:48.010 "raid_level": "concat", 00:24:48.010 "superblock": true, 00:24:48.010 "num_base_bdevs": 4, 00:24:48.010 "num_base_bdevs_discovered": 4, 00:24:48.010 "num_base_bdevs_operational": 4, 00:24:48.010 "base_bdevs_list": [ 00:24:48.010 { 00:24:48.010 "name": "BaseBdev1", 00:24:48.010 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:48.010 "is_configured": true, 00:24:48.010 "data_offset": 2048, 00:24:48.010 "data_size": 63488 00:24:48.010 }, 00:24:48.010 { 00:24:48.010 "name": "BaseBdev2", 00:24:48.010 "uuid": "4997a348-fdcb-423b-a612-604357ac7340", 00:24:48.010 "is_configured": true, 00:24:48.010 "data_offset": 2048, 00:24:48.010 "data_size": 63488 00:24:48.010 }, 00:24:48.010 { 00:24:48.010 "name": "BaseBdev3", 00:24:48.010 "uuid": "2763dea6-53ee-46db-b4da-1998d07714cc", 00:24:48.010 "is_configured": true, 00:24:48.010 "data_offset": 2048, 00:24:48.010 "data_size": 63488 00:24:48.010 }, 00:24:48.010 { 00:24:48.010 "name": "BaseBdev4", 00:24:48.010 "uuid": "204309d5-1af3-4ce5-8b6e-7faab6ed4a57", 00:24:48.010 "is_configured": true, 00:24:48.010 "data_offset": 2048, 00:24:48.010 "data_size": 63488 00:24:48.010 } 00:24:48.010 ] 00:24:48.010 }' 00:24:48.010 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.010 17:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:48.576 17:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:48.835 [2024-07-23 17:18:44.203255] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:48.835 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:48.835 "name": "Existed_Raid", 00:24:48.835 "aliases": [ 00:24:48.835 "5e2df416-6f61-47e6-bdda-1cc578fee4c9" 00:24:48.835 ], 00:24:48.835 "product_name": "Raid Volume", 00:24:48.835 "block_size": 512, 00:24:48.835 "num_blocks": 253952, 00:24:48.835 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:48.835 "assigned_rate_limits": { 00:24:48.835 "rw_ios_per_sec": 0, 00:24:48.835 "rw_mbytes_per_sec": 0, 00:24:48.835 "r_mbytes_per_sec": 0, 00:24:48.835 "w_mbytes_per_sec": 0 00:24:48.835 }, 00:24:48.835 "claimed": false, 00:24:48.835 "zoned": false, 00:24:48.835 "supported_io_types": { 00:24:48.835 "read": true, 00:24:48.835 "write": true, 00:24:48.835 "unmap": true, 00:24:48.835 "flush": true, 00:24:48.835 "reset": true, 00:24:48.835 "nvme_admin": false, 00:24:48.835 "nvme_io": false, 00:24:48.835 "nvme_io_md": false, 00:24:48.835 "write_zeroes": true, 00:24:48.835 "zcopy": false, 00:24:48.835 "get_zone_info": false, 00:24:48.835 "zone_management": false, 00:24:48.835 "zone_append": false, 00:24:48.835 "compare": false, 00:24:48.835 "compare_and_write": false, 00:24:48.835 "abort": false, 00:24:48.835 "seek_hole": 
false, 00:24:48.835 "seek_data": false, 00:24:48.835 "copy": false, 00:24:48.835 "nvme_iov_md": false 00:24:48.835 }, 00:24:48.835 "memory_domains": [ 00:24:48.835 { 00:24:48.835 "dma_device_id": "system", 00:24:48.835 "dma_device_type": 1 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.835 "dma_device_type": 2 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "system", 00:24:48.835 "dma_device_type": 1 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.835 "dma_device_type": 2 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "system", 00:24:48.835 "dma_device_type": 1 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.835 "dma_device_type": 2 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "system", 00:24:48.835 "dma_device_type": 1 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:48.835 "dma_device_type": 2 00:24:48.835 } 00:24:48.835 ], 00:24:48.835 "driver_specific": { 00:24:48.835 "raid": { 00:24:48.835 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:48.835 "strip_size_kb": 64, 00:24:48.835 "state": "online", 00:24:48.835 "raid_level": "concat", 00:24:48.835 "superblock": true, 00:24:48.835 "num_base_bdevs": 4, 00:24:48.835 "num_base_bdevs_discovered": 4, 00:24:48.835 "num_base_bdevs_operational": 4, 00:24:48.835 "base_bdevs_list": [ 00:24:48.835 { 00:24:48.835 "name": "BaseBdev1", 00:24:48.835 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:48.835 "is_configured": true, 00:24:48.835 "data_offset": 2048, 00:24:48.835 "data_size": 63488 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "name": "BaseBdev2", 00:24:48.835 "uuid": "4997a348-fdcb-423b-a612-604357ac7340", 00:24:48.835 "is_configured": true, 00:24:48.835 "data_offset": 2048, 00:24:48.835 "data_size": 63488 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "name": "BaseBdev3", 00:24:48.835 
"uuid": "2763dea6-53ee-46db-b4da-1998d07714cc", 00:24:48.835 "is_configured": true, 00:24:48.835 "data_offset": 2048, 00:24:48.835 "data_size": 63488 00:24:48.835 }, 00:24:48.835 { 00:24:48.835 "name": "BaseBdev4", 00:24:48.835 "uuid": "204309d5-1af3-4ce5-8b6e-7faab6ed4a57", 00:24:48.835 "is_configured": true, 00:24:48.835 "data_offset": 2048, 00:24:48.835 "data_size": 63488 00:24:48.835 } 00:24:48.835 ] 00:24:48.835 } 00:24:48.835 } 00:24:48.835 }' 00:24:48.835 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:49.093 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:49.093 BaseBdev2 00:24:49.093 BaseBdev3 00:24:49.093 BaseBdev4' 00:24:49.093 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:49.093 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:49.093 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:49.661 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:49.661 "name": "BaseBdev1", 00:24:49.661 "aliases": [ 00:24:49.661 "ba37e70d-362e-4c04-aa10-37fcd9d427d5" 00:24:49.661 ], 00:24:49.661 "product_name": "Malloc disk", 00:24:49.661 "block_size": 512, 00:24:49.661 "num_blocks": 65536, 00:24:49.661 "uuid": "ba37e70d-362e-4c04-aa10-37fcd9d427d5", 00:24:49.661 "assigned_rate_limits": { 00:24:49.661 "rw_ios_per_sec": 0, 00:24:49.661 "rw_mbytes_per_sec": 0, 00:24:49.661 "r_mbytes_per_sec": 0, 00:24:49.661 "w_mbytes_per_sec": 0 00:24:49.661 }, 00:24:49.661 "claimed": true, 00:24:49.661 "claim_type": "exclusive_write", 00:24:49.661 "zoned": false, 00:24:49.661 "supported_io_types": { 
00:24:49.661 "read": true, 00:24:49.661 "write": true, 00:24:49.661 "unmap": true, 00:24:49.661 "flush": true, 00:24:49.661 "reset": true, 00:24:49.661 "nvme_admin": false, 00:24:49.661 "nvme_io": false, 00:24:49.661 "nvme_io_md": false, 00:24:49.661 "write_zeroes": true, 00:24:49.661 "zcopy": true, 00:24:49.661 "get_zone_info": false, 00:24:49.661 "zone_management": false, 00:24:49.661 "zone_append": false, 00:24:49.661 "compare": false, 00:24:49.661 "compare_and_write": false, 00:24:49.661 "abort": true, 00:24:49.661 "seek_hole": false, 00:24:49.661 "seek_data": false, 00:24:49.661 "copy": true, 00:24:49.661 "nvme_iov_md": false 00:24:49.661 }, 00:24:49.661 "memory_domains": [ 00:24:49.661 { 00:24:49.661 "dma_device_id": "system", 00:24:49.661 "dma_device_type": 1 00:24:49.661 }, 00:24:49.661 { 00:24:49.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:49.661 "dma_device_type": 2 00:24:49.661 } 00:24:49.661 ], 00:24:49.661 "driver_specific": {} 00:24:49.661 }' 00:24:49.661 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:49.661 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:49.661 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:49.661 17:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:49.661 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:49.661 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:49.661 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:49.920 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:50.180 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:50.180 "name": "BaseBdev2", 00:24:50.180 "aliases": [ 00:24:50.180 "4997a348-fdcb-423b-a612-604357ac7340" 00:24:50.180 ], 00:24:50.180 "product_name": "Malloc disk", 00:24:50.180 "block_size": 512, 00:24:50.180 "num_blocks": 65536, 00:24:50.180 "uuid": "4997a348-fdcb-423b-a612-604357ac7340", 00:24:50.180 "assigned_rate_limits": { 00:24:50.180 "rw_ios_per_sec": 0, 00:24:50.180 "rw_mbytes_per_sec": 0, 00:24:50.180 "r_mbytes_per_sec": 0, 00:24:50.180 "w_mbytes_per_sec": 0 00:24:50.180 }, 00:24:50.180 "claimed": true, 00:24:50.180 "claim_type": "exclusive_write", 00:24:50.180 "zoned": false, 00:24:50.180 "supported_io_types": { 00:24:50.180 "read": true, 00:24:50.180 "write": true, 00:24:50.180 "unmap": true, 00:24:50.180 "flush": true, 00:24:50.180 "reset": true, 00:24:50.180 "nvme_admin": false, 00:24:50.180 "nvme_io": false, 00:24:50.180 "nvme_io_md": false, 00:24:50.180 "write_zeroes": true, 00:24:50.180 "zcopy": true, 00:24:50.180 "get_zone_info": false, 00:24:50.180 "zone_management": false, 00:24:50.180 "zone_append": false, 00:24:50.180 "compare": false, 00:24:50.180 "compare_and_write": false, 00:24:50.180 "abort": true, 00:24:50.180 "seek_hole": false, 00:24:50.180 "seek_data": 
false, 00:24:50.180 "copy": true, 00:24:50.180 "nvme_iov_md": false 00:24:50.180 }, 00:24:50.180 "memory_domains": [ 00:24:50.180 { 00:24:50.180 "dma_device_id": "system", 00:24:50.180 "dma_device_type": 1 00:24:50.180 }, 00:24:50.180 { 00:24:50.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:50.180 "dma_device_type": 2 00:24:50.180 } 00:24:50.180 ], 00:24:50.180 "driver_specific": {} 00:24:50.180 }' 00:24:50.180 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:50.439 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:50.698 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:50.698 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:50.698 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:50.698 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:24:50.698 17:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:50.957 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:50.957 "name": "BaseBdev3", 00:24:50.957 "aliases": [ 00:24:50.957 "2763dea6-53ee-46db-b4da-1998d07714cc" 00:24:50.957 ], 00:24:50.957 "product_name": "Malloc disk", 00:24:50.957 "block_size": 512, 00:24:50.957 "num_blocks": 65536, 00:24:50.957 "uuid": "2763dea6-53ee-46db-b4da-1998d07714cc", 00:24:50.957 "assigned_rate_limits": { 00:24:50.957 "rw_ios_per_sec": 0, 00:24:50.957 "rw_mbytes_per_sec": 0, 00:24:50.957 "r_mbytes_per_sec": 0, 00:24:50.957 "w_mbytes_per_sec": 0 00:24:50.957 }, 00:24:50.957 "claimed": true, 00:24:50.957 "claim_type": "exclusive_write", 00:24:50.957 "zoned": false, 00:24:50.957 "supported_io_types": { 00:24:50.957 "read": true, 00:24:50.957 "write": true, 00:24:50.957 "unmap": true, 00:24:50.957 "flush": true, 00:24:50.957 "reset": true, 00:24:50.957 "nvme_admin": false, 00:24:50.957 "nvme_io": false, 00:24:50.957 "nvme_io_md": false, 00:24:50.957 "write_zeroes": true, 00:24:50.957 "zcopy": true, 00:24:50.957 "get_zone_info": false, 00:24:50.957 "zone_management": false, 00:24:50.957 "zone_append": false, 00:24:50.957 "compare": false, 00:24:50.957 "compare_and_write": false, 00:24:50.957 "abort": true, 00:24:50.957 "seek_hole": false, 00:24:50.957 "seek_data": false, 00:24:50.957 "copy": true, 00:24:50.957 "nvme_iov_md": false 00:24:50.957 }, 00:24:50.957 "memory_domains": [ 00:24:50.957 { 00:24:50.957 "dma_device_id": "system", 00:24:50.957 "dma_device_type": 1 00:24:50.957 }, 00:24:50.957 { 00:24:50.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:50.957 "dma_device_type": 2 00:24:50.957 } 00:24:50.957 ], 00:24:50.957 "driver_specific": {} 00:24:50.957 }' 00:24:50.957 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:50.957 17:18:46 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:50.957 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:50.957 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:50.957 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:51.216 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:51.216 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:51.216 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:51.216 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:51.216 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:51.216 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:51.475 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:51.475 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:51.475 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:24:51.475 17:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:52.084 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:52.084 "name": "BaseBdev4", 00:24:52.084 "aliases": [ 00:24:52.084 "204309d5-1af3-4ce5-8b6e-7faab6ed4a57" 00:24:52.084 ], 00:24:52.084 "product_name": "Malloc disk", 00:24:52.084 "block_size": 512, 00:24:52.084 "num_blocks": 65536, 00:24:52.084 "uuid": "204309d5-1af3-4ce5-8b6e-7faab6ed4a57", 00:24:52.084 "assigned_rate_limits": { 00:24:52.084 
"rw_ios_per_sec": 0, 00:24:52.084 "rw_mbytes_per_sec": 0, 00:24:52.084 "r_mbytes_per_sec": 0, 00:24:52.084 "w_mbytes_per_sec": 0 00:24:52.084 }, 00:24:52.084 "claimed": true, 00:24:52.084 "claim_type": "exclusive_write", 00:24:52.084 "zoned": false, 00:24:52.084 "supported_io_types": { 00:24:52.084 "read": true, 00:24:52.084 "write": true, 00:24:52.084 "unmap": true, 00:24:52.084 "flush": true, 00:24:52.084 "reset": true, 00:24:52.084 "nvme_admin": false, 00:24:52.084 "nvme_io": false, 00:24:52.084 "nvme_io_md": false, 00:24:52.084 "write_zeroes": true, 00:24:52.084 "zcopy": true, 00:24:52.084 "get_zone_info": false, 00:24:52.084 "zone_management": false, 00:24:52.084 "zone_append": false, 00:24:52.084 "compare": false, 00:24:52.084 "compare_and_write": false, 00:24:52.084 "abort": true, 00:24:52.084 "seek_hole": false, 00:24:52.084 "seek_data": false, 00:24:52.084 "copy": true, 00:24:52.084 "nvme_iov_md": false 00:24:52.084 }, 00:24:52.084 "memory_domains": [ 00:24:52.084 { 00:24:52.084 "dma_device_id": "system", 00:24:52.084 "dma_device_type": 1 00:24:52.084 }, 00:24:52.084 { 00:24:52.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:52.084 "dma_device_type": 2 00:24:52.084 } 00:24:52.084 ], 00:24:52.084 "driver_specific": {} 00:24:52.084 }' 00:24:52.084 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:52.084 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:52.084 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:52.084 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:52.084 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:52.367 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:52.626 [2024-07-23 17:18:47.844668] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:52.626 [2024-07-23 17:18:47.844696] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:52.626 [2024-07-23 17:18:47.844747] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:52.626 17:18:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.627 17:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:52.886 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:52.886 "name": "Existed_Raid", 00:24:52.886 "uuid": "5e2df416-6f61-47e6-bdda-1cc578fee4c9", 00:24:52.886 "strip_size_kb": 64, 00:24:52.886 "state": "offline", 00:24:52.886 "raid_level": "concat", 00:24:52.886 "superblock": true, 00:24:52.886 "num_base_bdevs": 4, 00:24:52.886 "num_base_bdevs_discovered": 3, 00:24:52.886 "num_base_bdevs_operational": 3, 00:24:52.886 "base_bdevs_list": [ 00:24:52.886 { 00:24:52.886 "name": null, 00:24:52.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.886 "is_configured": false, 00:24:52.886 "data_offset": 2048, 00:24:52.886 "data_size": 63488 00:24:52.886 }, 00:24:52.886 { 00:24:52.886 "name": "BaseBdev2", 00:24:52.886 "uuid": 
"4997a348-fdcb-423b-a612-604357ac7340", 00:24:52.886 "is_configured": true, 00:24:52.886 "data_offset": 2048, 00:24:52.886 "data_size": 63488 00:24:52.886 }, 00:24:52.886 { 00:24:52.886 "name": "BaseBdev3", 00:24:52.886 "uuid": "2763dea6-53ee-46db-b4da-1998d07714cc", 00:24:52.886 "is_configured": true, 00:24:52.886 "data_offset": 2048, 00:24:52.886 "data_size": 63488 00:24:52.886 }, 00:24:52.886 { 00:24:52.886 "name": "BaseBdev4", 00:24:52.886 "uuid": "204309d5-1af3-4ce5-8b6e-7faab6ed4a57", 00:24:52.886 "is_configured": true, 00:24:52.886 "data_offset": 2048, 00:24:52.886 "data_size": 63488 00:24:52.886 } 00:24:52.886 ] 00:24:52.886 }' 00:24:52.886 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:52.886 17:18:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:53.453 17:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:53.712 [2024-07-23 17:18:49.048967] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:53.712 17:18:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:53.712 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:53.712 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:53.712 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.971 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:53.971 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:53.971 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:24:54.230 [2024-07-23 17:18:49.568931] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:24:54.230 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:54.230 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:54.230 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.230 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:54.489 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:54.489 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:54.489 17:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:24:54.748 [2024-07-23 17:18:50.074649] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:24:54.748 [2024-07-23 17:18:50.074697] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbde9b0 name Existed_Raid, state offline 00:24:54.748 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:54.748 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:54.748 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.748 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:55.007 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:55.007 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:55.007 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:24:55.007 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:24:55.007 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:55.007 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:24:55.266 BaseBdev2 00:24:55.266 17:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:24:55.266 17:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:24:55.266 17:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:55.266 17:18:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:55.266 17:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:55.266 17:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:55.266 17:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:55.525 17:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:55.784 [ 00:24:55.784 { 00:24:55.784 "name": "BaseBdev2", 00:24:55.784 "aliases": [ 00:24:55.784 "6ff61f26-f7a6-48b4-8e32-7363737e2f4c" 00:24:55.784 ], 00:24:55.784 "product_name": "Malloc disk", 00:24:55.784 "block_size": 512, 00:24:55.784 "num_blocks": 65536, 00:24:55.784 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:24:55.784 "assigned_rate_limits": { 00:24:55.784 "rw_ios_per_sec": 0, 00:24:55.784 "rw_mbytes_per_sec": 0, 00:24:55.784 "r_mbytes_per_sec": 0, 00:24:55.784 "w_mbytes_per_sec": 0 00:24:55.784 }, 00:24:55.784 "claimed": false, 00:24:55.784 "zoned": false, 00:24:55.784 "supported_io_types": { 00:24:55.784 "read": true, 00:24:55.784 "write": true, 00:24:55.784 "unmap": true, 00:24:55.784 "flush": true, 00:24:55.784 "reset": true, 00:24:55.784 "nvme_admin": false, 00:24:55.784 "nvme_io": false, 00:24:55.784 "nvme_io_md": false, 00:24:55.784 "write_zeroes": true, 00:24:55.784 "zcopy": true, 00:24:55.784 "get_zone_info": false, 00:24:55.784 "zone_management": false, 00:24:55.784 "zone_append": false, 00:24:55.784 "compare": false, 00:24:55.784 "compare_and_write": false, 00:24:55.784 "abort": true, 00:24:55.784 "seek_hole": false, 00:24:55.784 "seek_data": false, 00:24:55.784 "copy": true, 00:24:55.784 "nvme_iov_md": 
false 00:24:55.784 }, 00:24:55.784 "memory_domains": [ 00:24:55.784 { 00:24:55.784 "dma_device_id": "system", 00:24:55.784 "dma_device_type": 1 00:24:55.784 }, 00:24:55.784 { 00:24:55.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:55.784 "dma_device_type": 2 00:24:55.784 } 00:24:55.784 ], 00:24:55.784 "driver_specific": {} 00:24:55.784 } 00:24:55.784 ] 00:24:55.784 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:55.784 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:55.784 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:55.784 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:24:56.043 BaseBdev3 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:56.043 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:56.301 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:24:56.560 [ 00:24:56.560 { 00:24:56.560 "name": "BaseBdev3", 00:24:56.560 "aliases": [ 00:24:56.560 "317f6897-abfe-4d93-afc9-9cd47bf5607e" 00:24:56.560 ], 00:24:56.560 "product_name": "Malloc disk", 00:24:56.560 "block_size": 512, 00:24:56.560 "num_blocks": 65536, 00:24:56.560 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:24:56.560 "assigned_rate_limits": { 00:24:56.560 "rw_ios_per_sec": 0, 00:24:56.560 "rw_mbytes_per_sec": 0, 00:24:56.560 "r_mbytes_per_sec": 0, 00:24:56.560 "w_mbytes_per_sec": 0 00:24:56.560 }, 00:24:56.560 "claimed": false, 00:24:56.560 "zoned": false, 00:24:56.560 "supported_io_types": { 00:24:56.560 "read": true, 00:24:56.560 "write": true, 00:24:56.560 "unmap": true, 00:24:56.560 "flush": true, 00:24:56.560 "reset": true, 00:24:56.560 "nvme_admin": false, 00:24:56.560 "nvme_io": false, 00:24:56.560 "nvme_io_md": false, 00:24:56.560 "write_zeroes": true, 00:24:56.560 "zcopy": true, 00:24:56.560 "get_zone_info": false, 00:24:56.560 "zone_management": false, 00:24:56.560 "zone_append": false, 00:24:56.560 "compare": false, 00:24:56.560 "compare_and_write": false, 00:24:56.560 "abort": true, 00:24:56.560 "seek_hole": false, 00:24:56.560 "seek_data": false, 00:24:56.560 "copy": true, 00:24:56.560 "nvme_iov_md": false 00:24:56.560 }, 00:24:56.560 "memory_domains": [ 00:24:56.560 { 00:24:56.560 "dma_device_id": "system", 00:24:56.560 "dma_device_type": 1 00:24:56.560 }, 00:24:56.560 { 00:24:56.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:56.560 "dma_device_type": 2 00:24:56.560 } 00:24:56.560 ], 00:24:56.560 "driver_specific": {} 00:24:56.560 } 00:24:56.560 ] 00:24:56.560 17:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:56.560 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:56.560 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:24:56.560 17:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:24:56.819 BaseBdev4 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:56.819 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:57.078 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:24:57.337 [ 00:24:57.337 { 00:24:57.337 "name": "BaseBdev4", 00:24:57.337 "aliases": [ 00:24:57.337 "d4108487-0595-4bb5-b1b8-894f24e421a1" 00:24:57.337 ], 00:24:57.337 "product_name": "Malloc disk", 00:24:57.337 "block_size": 512, 00:24:57.337 "num_blocks": 65536, 00:24:57.337 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:24:57.337 "assigned_rate_limits": { 00:24:57.337 "rw_ios_per_sec": 0, 00:24:57.337 "rw_mbytes_per_sec": 0, 00:24:57.337 "r_mbytes_per_sec": 0, 00:24:57.337 "w_mbytes_per_sec": 0 00:24:57.337 }, 00:24:57.337 "claimed": false, 00:24:57.337 "zoned": false, 00:24:57.337 "supported_io_types": { 00:24:57.337 
"read": true, 00:24:57.337 "write": true, 00:24:57.337 "unmap": true, 00:24:57.337 "flush": true, 00:24:57.337 "reset": true, 00:24:57.337 "nvme_admin": false, 00:24:57.337 "nvme_io": false, 00:24:57.337 "nvme_io_md": false, 00:24:57.337 "write_zeroes": true, 00:24:57.337 "zcopy": true, 00:24:57.337 "get_zone_info": false, 00:24:57.337 "zone_management": false, 00:24:57.337 "zone_append": false, 00:24:57.337 "compare": false, 00:24:57.337 "compare_and_write": false, 00:24:57.337 "abort": true, 00:24:57.337 "seek_hole": false, 00:24:57.337 "seek_data": false, 00:24:57.337 "copy": true, 00:24:57.337 "nvme_iov_md": false 00:24:57.337 }, 00:24:57.337 "memory_domains": [ 00:24:57.337 { 00:24:57.337 "dma_device_id": "system", 00:24:57.337 "dma_device_type": 1 00:24:57.337 }, 00:24:57.337 { 00:24:57.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:57.337 "dma_device_type": 2 00:24:57.337 } 00:24:57.337 ], 00:24:57.337 "driver_specific": {} 00:24:57.337 } 00:24:57.337 ] 00:24:57.337 17:18:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:57.337 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:24:57.337 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:24:57.337 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:24:57.596 [2024-07-23 17:18:52.776547] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:57.596 [2024-07-23 17:18:52.776591] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:57.596 [2024-07-23 17:18:52.776612] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:57.596 [2024-07-23 
17:18:52.777922] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:57.596 [2024-07-23 17:18:52.777964] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.596 17:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:57.856 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:57.856 "name": "Existed_Raid", 00:24:57.856 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:24:57.856 "strip_size_kb": 64, 
00:24:57.856 "state": "configuring", 00:24:57.856 "raid_level": "concat", 00:24:57.856 "superblock": true, 00:24:57.856 "num_base_bdevs": 4, 00:24:57.856 "num_base_bdevs_discovered": 3, 00:24:57.856 "num_base_bdevs_operational": 4, 00:24:57.856 "base_bdevs_list": [ 00:24:57.856 { 00:24:57.856 "name": "BaseBdev1", 00:24:57.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.856 "is_configured": false, 00:24:57.856 "data_offset": 0, 00:24:57.856 "data_size": 0 00:24:57.856 }, 00:24:57.856 { 00:24:57.856 "name": "BaseBdev2", 00:24:57.856 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:24:57.856 "is_configured": true, 00:24:57.856 "data_offset": 2048, 00:24:57.856 "data_size": 63488 00:24:57.856 }, 00:24:57.856 { 00:24:57.856 "name": "BaseBdev3", 00:24:57.856 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:24:57.856 "is_configured": true, 00:24:57.856 "data_offset": 2048, 00:24:57.856 "data_size": 63488 00:24:57.856 }, 00:24:57.856 { 00:24:57.856 "name": "BaseBdev4", 00:24:57.856 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:24:57.856 "is_configured": true, 00:24:57.856 "data_offset": 2048, 00:24:57.856 "data_size": 63488 00:24:57.856 } 00:24:57.856 ] 00:24:57.856 }' 00:24:57.856 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:57.856 17:18:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:58.423 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:58.683 [2024-07-23 17:18:53.875404] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.683 17:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:58.942 17:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.942 "name": "Existed_Raid", 00:24:58.942 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:24:58.942 "strip_size_kb": 64, 00:24:58.942 "state": "configuring", 00:24:58.942 "raid_level": "concat", 00:24:58.942 "superblock": true, 00:24:58.942 "num_base_bdevs": 4, 00:24:58.942 "num_base_bdevs_discovered": 2, 00:24:58.942 "num_base_bdevs_operational": 4, 00:24:58.942 "base_bdevs_list": [ 00:24:58.942 { 00:24:58.942 "name": "BaseBdev1", 00:24:58.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.942 "is_configured": false, 00:24:58.942 "data_offset": 0, 00:24:58.942 "data_size": 0 
00:24:58.942 }, 00:24:58.942 { 00:24:58.942 "name": null, 00:24:58.942 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:24:58.942 "is_configured": false, 00:24:58.942 "data_offset": 2048, 00:24:58.942 "data_size": 63488 00:24:58.942 }, 00:24:58.942 { 00:24:58.942 "name": "BaseBdev3", 00:24:58.942 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:24:58.942 "is_configured": true, 00:24:58.942 "data_offset": 2048, 00:24:58.942 "data_size": 63488 00:24:58.942 }, 00:24:58.942 { 00:24:58.942 "name": "BaseBdev4", 00:24:58.942 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:24:58.942 "is_configured": true, 00:24:58.942 "data_offset": 2048, 00:24:58.942 "data_size": 63488 00:24:58.942 } 00:24:58.942 ] 00:24:58.942 }' 00:24:58.942 17:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.942 17:18:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:59.510 17:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.510 17:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:25:00.077 17:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:25:00.077 17:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:25:00.336 [2024-07-23 17:18:55.516331] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:00.336 BaseBdev1 00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:00.336 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:00.595 17:18:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:00.595 [ 00:25:00.595 { 00:25:00.595 "name": "BaseBdev1", 00:25:00.595 "aliases": [ 00:25:00.595 "dd3ac561-4308-4519-911a-c852eed2b72f" 00:25:00.595 ], 00:25:00.595 "product_name": "Malloc disk", 00:25:00.595 "block_size": 512, 00:25:00.595 "num_blocks": 65536, 00:25:00.595 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:00.595 "assigned_rate_limits": { 00:25:00.595 "rw_ios_per_sec": 0, 00:25:00.595 "rw_mbytes_per_sec": 0, 00:25:00.595 "r_mbytes_per_sec": 0, 00:25:00.595 "w_mbytes_per_sec": 0 00:25:00.595 }, 00:25:00.595 "claimed": true, 00:25:00.595 "claim_type": "exclusive_write", 00:25:00.595 "zoned": false, 00:25:00.595 "supported_io_types": { 00:25:00.595 "read": true, 00:25:00.595 "write": true, 00:25:00.595 "unmap": true, 00:25:00.595 "flush": true, 00:25:00.595 "reset": true, 00:25:00.595 "nvme_admin": false, 00:25:00.595 "nvme_io": false, 00:25:00.595 "nvme_io_md": false, 00:25:00.595 "write_zeroes": true, 00:25:00.595 "zcopy": true, 00:25:00.595 "get_zone_info": false, 00:25:00.595 "zone_management": false, 00:25:00.595 "zone_append": false, 00:25:00.595 "compare": false, 
00:25:00.595 "compare_and_write": false, 00:25:00.595 "abort": true, 00:25:00.595 "seek_hole": false, 00:25:00.595 "seek_data": false, 00:25:00.596 "copy": true, 00:25:00.596 "nvme_iov_md": false 00:25:00.596 }, 00:25:00.596 "memory_domains": [ 00:25:00.596 { 00:25:00.596 "dma_device_id": "system", 00:25:00.596 "dma_device_type": 1 00:25:00.596 }, 00:25:00.596 { 00:25:00.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:00.596 "dma_device_type": 2 00:25:00.596 } 00:25:00.596 ], 00:25:00.596 "driver_specific": {} 00:25:00.596 } 00:25:00.596 ] 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:00.855 "name": "Existed_Raid", 00:25:00.855 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:00.855 "strip_size_kb": 64, 00:25:00.855 "state": "configuring", 00:25:00.855 "raid_level": "concat", 00:25:00.855 "superblock": true, 00:25:00.855 "num_base_bdevs": 4, 00:25:00.855 "num_base_bdevs_discovered": 3, 00:25:00.855 "num_base_bdevs_operational": 4, 00:25:00.855 "base_bdevs_list": [ 00:25:00.855 { 00:25:00.855 "name": "BaseBdev1", 00:25:00.855 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:00.855 "is_configured": true, 00:25:00.855 "data_offset": 2048, 00:25:00.855 "data_size": 63488 00:25:00.855 }, 00:25:00.855 { 00:25:00.855 "name": null, 00:25:00.855 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:00.855 "is_configured": false, 00:25:00.855 "data_offset": 2048, 00:25:00.855 "data_size": 63488 00:25:00.855 }, 00:25:00.855 { 00:25:00.855 "name": "BaseBdev3", 00:25:00.855 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:00.855 "is_configured": true, 00:25:00.855 "data_offset": 2048, 00:25:00.855 "data_size": 63488 00:25:00.855 }, 00:25:00.855 { 00:25:00.855 "name": "BaseBdev4", 00:25:00.855 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:00.855 "is_configured": true, 00:25:00.855 "data_offset": 2048, 00:25:00.855 "data_size": 63488 00:25:00.855 } 00:25:00.855 ] 00:25:00.855 }' 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:00.855 17:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:01.792 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.792 17:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:25:01.792 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:25:01.792 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:25:02.051 [2024-07-23 17:18:57.369267] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:02.051 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:02.310 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.310 "name": "Existed_Raid", 00:25:02.310 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:02.310 "strip_size_kb": 64, 00:25:02.310 "state": "configuring", 00:25:02.310 "raid_level": "concat", 00:25:02.310 "superblock": true, 00:25:02.310 "num_base_bdevs": 4, 00:25:02.310 "num_base_bdevs_discovered": 2, 00:25:02.310 "num_base_bdevs_operational": 4, 00:25:02.310 "base_bdevs_list": [ 00:25:02.310 { 00:25:02.310 "name": "BaseBdev1", 00:25:02.310 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:02.310 "is_configured": true, 00:25:02.310 "data_offset": 2048, 00:25:02.310 "data_size": 63488 00:25:02.310 }, 00:25:02.310 { 00:25:02.310 "name": null, 00:25:02.310 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:02.310 "is_configured": false, 00:25:02.310 "data_offset": 2048, 00:25:02.310 "data_size": 63488 00:25:02.310 }, 00:25:02.310 { 00:25:02.310 "name": null, 00:25:02.310 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:02.310 "is_configured": false, 00:25:02.310 "data_offset": 2048, 00:25:02.310 "data_size": 63488 00:25:02.310 }, 00:25:02.310 { 00:25:02.310 "name": "BaseBdev4", 00:25:02.310 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:02.310 "is_configured": true, 00:25:02.310 "data_offset": 2048, 00:25:02.310 "data_size": 63488 00:25:02.310 } 00:25:02.310 ] 00:25:02.310 }' 00:25:02.310 17:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.310 17:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:02.877 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:02.877 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:25:03.136 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:25:03.136 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:25:03.704 [2024-07-23 17:18:58.849206] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:03.704 17:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:03.962 17:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.962 "name": "Existed_Raid", 00:25:03.962 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:03.962 "strip_size_kb": 64, 00:25:03.962 "state": "configuring", 00:25:03.962 "raid_level": "concat", 00:25:03.962 "superblock": true, 00:25:03.962 "num_base_bdevs": 4, 00:25:03.962 "num_base_bdevs_discovered": 3, 00:25:03.962 "num_base_bdevs_operational": 4, 00:25:03.962 "base_bdevs_list": [ 00:25:03.962 { 00:25:03.962 "name": "BaseBdev1", 00:25:03.962 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:03.962 "is_configured": true, 00:25:03.962 "data_offset": 2048, 00:25:03.962 "data_size": 63488 00:25:03.962 }, 00:25:03.962 { 00:25:03.962 "name": null, 00:25:03.962 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:03.962 "is_configured": false, 00:25:03.962 "data_offset": 2048, 00:25:03.962 "data_size": 63488 00:25:03.962 }, 00:25:03.962 { 00:25:03.962 "name": "BaseBdev3", 00:25:03.962 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:03.962 "is_configured": true, 00:25:03.962 "data_offset": 2048, 00:25:03.962 "data_size": 63488 00:25:03.962 }, 00:25:03.962 { 00:25:03.962 "name": "BaseBdev4", 00:25:03.962 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:03.962 "is_configured": true, 00:25:03.962 "data_offset": 2048, 00:25:03.962 "data_size": 63488 00:25:03.962 } 00:25:03.962 ] 00:25:03.962 }' 00:25:03.962 17:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:03.962 17:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:04.529 17:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:04.529 17:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:25:04.788 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:25:04.788 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:05.046 [2024-07-23 17:19:00.260980] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:05.046 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.046 
17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:05.305 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:05.305 "name": "Existed_Raid", 00:25:05.305 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:05.305 "strip_size_kb": 64, 00:25:05.305 "state": "configuring", 00:25:05.305 "raid_level": "concat", 00:25:05.305 "superblock": true, 00:25:05.305 "num_base_bdevs": 4, 00:25:05.305 "num_base_bdevs_discovered": 2, 00:25:05.305 "num_base_bdevs_operational": 4, 00:25:05.305 "base_bdevs_list": [ 00:25:05.305 { 00:25:05.305 "name": null, 00:25:05.305 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:05.305 "is_configured": false, 00:25:05.305 "data_offset": 2048, 00:25:05.305 "data_size": 63488 00:25:05.305 }, 00:25:05.305 { 00:25:05.305 "name": null, 00:25:05.305 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:05.305 "is_configured": false, 00:25:05.305 "data_offset": 2048, 00:25:05.305 "data_size": 63488 00:25:05.305 }, 00:25:05.305 { 00:25:05.305 "name": "BaseBdev3", 00:25:05.305 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:05.305 "is_configured": true, 00:25:05.305 "data_offset": 2048, 00:25:05.305 "data_size": 63488 00:25:05.305 }, 00:25:05.305 { 00:25:05.305 "name": "BaseBdev4", 00:25:05.305 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:05.305 "is_configured": true, 00:25:05.305 "data_offset": 2048, 00:25:05.305 "data_size": 63488 00:25:05.305 } 00:25:05.305 ] 00:25:05.305 }' 00:25:05.305 17:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:05.305 17:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:05.872 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.872 17:19:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:25:06.131 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:25:06.131 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:25:06.389 [2024-07-23 17:19:01.620850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.389 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.389 17:19:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:06.648 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.648 "name": "Existed_Raid", 00:25:06.648 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:06.648 "strip_size_kb": 64, 00:25:06.648 "state": "configuring", 00:25:06.648 "raid_level": "concat", 00:25:06.648 "superblock": true, 00:25:06.648 "num_base_bdevs": 4, 00:25:06.648 "num_base_bdevs_discovered": 3, 00:25:06.648 "num_base_bdevs_operational": 4, 00:25:06.648 "base_bdevs_list": [ 00:25:06.648 { 00:25:06.648 "name": null, 00:25:06.648 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:06.648 "is_configured": false, 00:25:06.648 "data_offset": 2048, 00:25:06.648 "data_size": 63488 00:25:06.648 }, 00:25:06.648 { 00:25:06.648 "name": "BaseBdev2", 00:25:06.648 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:06.648 "is_configured": true, 00:25:06.648 "data_offset": 2048, 00:25:06.648 "data_size": 63488 00:25:06.648 }, 00:25:06.648 { 00:25:06.648 "name": "BaseBdev3", 00:25:06.648 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:06.648 "is_configured": true, 00:25:06.648 "data_offset": 2048, 00:25:06.648 "data_size": 63488 00:25:06.648 }, 00:25:06.648 { 00:25:06.648 "name": "BaseBdev4", 00:25:06.648 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:06.648 "is_configured": true, 00:25:06.648 "data_offset": 2048, 00:25:06.648 "data_size": 63488 00:25:06.648 } 00:25:06.648 ] 00:25:06.648 }' 00:25:06.648 17:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.648 17:19:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:07.215 17:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.215 17:19:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:25:07.474 17:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:25:07.474 17:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.474 17:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:25:07.733 17:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u dd3ac561-4308-4519-911a-c852eed2b72f 00:25:07.991 [2024-07-23 17:19:03.201566] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:25:07.991 [2024-07-23 17:19:03.201730] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd938f0 00:25:07.991 [2024-07-23 17:19:03.201743] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:25:07.991 [2024-07-23 17:19:03.201932] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc904a0 00:25:07.991 [2024-07-23 17:19:03.202053] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd938f0 00:25:07.991 [2024-07-23 17:19:03.202063] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd938f0 00:25:07.991 [2024-07-23 17:19:03.202157] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.991 NewBaseBdev 00:25:07.991 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:25:07.991 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:25:07.991 17:19:03 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:07.991 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:25:07.991 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:07.991 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:07.991 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:08.249 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:25:08.507 [ 00:25:08.507 { 00:25:08.507 "name": "NewBaseBdev", 00:25:08.507 "aliases": [ 00:25:08.507 "dd3ac561-4308-4519-911a-c852eed2b72f" 00:25:08.507 ], 00:25:08.507 "product_name": "Malloc disk", 00:25:08.507 "block_size": 512, 00:25:08.507 "num_blocks": 65536, 00:25:08.507 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:08.507 "assigned_rate_limits": { 00:25:08.507 "rw_ios_per_sec": 0, 00:25:08.507 "rw_mbytes_per_sec": 0, 00:25:08.507 "r_mbytes_per_sec": 0, 00:25:08.507 "w_mbytes_per_sec": 0 00:25:08.507 }, 00:25:08.507 "claimed": true, 00:25:08.507 "claim_type": "exclusive_write", 00:25:08.507 "zoned": false, 00:25:08.507 "supported_io_types": { 00:25:08.507 "read": true, 00:25:08.507 "write": true, 00:25:08.507 "unmap": true, 00:25:08.507 "flush": true, 00:25:08.507 "reset": true, 00:25:08.507 "nvme_admin": false, 00:25:08.507 "nvme_io": false, 00:25:08.507 "nvme_io_md": false, 00:25:08.507 "write_zeroes": true, 00:25:08.507 "zcopy": true, 00:25:08.507 "get_zone_info": false, 00:25:08.507 "zone_management": false, 00:25:08.507 "zone_append": false, 00:25:08.507 "compare": false, 00:25:08.507 
"compare_and_write": false, 00:25:08.507 "abort": true, 00:25:08.507 "seek_hole": false, 00:25:08.507 "seek_data": false, 00:25:08.507 "copy": true, 00:25:08.507 "nvme_iov_md": false 00:25:08.507 }, 00:25:08.507 "memory_domains": [ 00:25:08.507 { 00:25:08.507 "dma_device_id": "system", 00:25:08.507 "dma_device_type": 1 00:25:08.507 }, 00:25:08.507 { 00:25:08.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:08.507 "dma_device_type": 2 00:25:08.507 } 00:25:08.507 ], 00:25:08.507 "driver_specific": {} 00:25:08.507 } 00:25:08.507 ] 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.507 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:08.768 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:08.768 "name": "Existed_Raid", 00:25:08.768 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:08.768 "strip_size_kb": 64, 00:25:08.768 "state": "online", 00:25:08.768 "raid_level": "concat", 00:25:08.768 "superblock": true, 00:25:08.768 "num_base_bdevs": 4, 00:25:08.768 "num_base_bdevs_discovered": 4, 00:25:08.768 "num_base_bdevs_operational": 4, 00:25:08.768 "base_bdevs_list": [ 00:25:08.768 { 00:25:08.768 "name": "NewBaseBdev", 00:25:08.768 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:08.768 "is_configured": true, 00:25:08.768 "data_offset": 2048, 00:25:08.768 "data_size": 63488 00:25:08.768 }, 00:25:08.768 { 00:25:08.768 "name": "BaseBdev2", 00:25:08.768 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:08.768 "is_configured": true, 00:25:08.768 "data_offset": 2048, 00:25:08.768 "data_size": 63488 00:25:08.768 }, 00:25:08.768 { 00:25:08.768 "name": "BaseBdev3", 00:25:08.768 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:08.768 "is_configured": true, 00:25:08.768 "data_offset": 2048, 00:25:08.768 "data_size": 63488 00:25:08.768 }, 00:25:08.768 { 00:25:08.768 "name": "BaseBdev4", 00:25:08.768 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:08.768 "is_configured": true, 00:25:08.768 "data_offset": 2048, 00:25:08.768 "data_size": 63488 00:25:08.768 } 00:25:08.768 ] 00:25:08.768 }' 00:25:08.768 17:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:08.768 17:19:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:25:09.369 17:19:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:09.369 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:09.369 [2024-07-23 17:19:04.790341] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:09.628 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:09.628 "name": "Existed_Raid", 00:25:09.628 "aliases": [ 00:25:09.628 "6900225c-bbb0-49fe-92cd-3624bd16c388" 00:25:09.628 ], 00:25:09.628 "product_name": "Raid Volume", 00:25:09.628 "block_size": 512, 00:25:09.628 "num_blocks": 253952, 00:25:09.628 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:09.628 "assigned_rate_limits": { 00:25:09.628 "rw_ios_per_sec": 0, 00:25:09.628 "rw_mbytes_per_sec": 0, 00:25:09.628 "r_mbytes_per_sec": 0, 00:25:09.628 "w_mbytes_per_sec": 0 00:25:09.628 }, 00:25:09.628 "claimed": false, 00:25:09.628 "zoned": false, 00:25:09.628 "supported_io_types": { 00:25:09.628 "read": true, 00:25:09.628 "write": true, 00:25:09.628 "unmap": true, 00:25:09.628 "flush": true, 00:25:09.628 "reset": true, 00:25:09.628 "nvme_admin": false, 00:25:09.628 "nvme_io": false, 00:25:09.628 "nvme_io_md": false, 00:25:09.628 "write_zeroes": true, 00:25:09.628 "zcopy": false, 00:25:09.628 
"get_zone_info": false, 00:25:09.628 "zone_management": false, 00:25:09.628 "zone_append": false, 00:25:09.628 "compare": false, 00:25:09.628 "compare_and_write": false, 00:25:09.628 "abort": false, 00:25:09.628 "seek_hole": false, 00:25:09.628 "seek_data": false, 00:25:09.628 "copy": false, 00:25:09.628 "nvme_iov_md": false 00:25:09.628 }, 00:25:09.628 "memory_domains": [ 00:25:09.628 { 00:25:09.628 "dma_device_id": "system", 00:25:09.628 "dma_device_type": 1 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.628 "dma_device_type": 2 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "system", 00:25:09.628 "dma_device_type": 1 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.628 "dma_device_type": 2 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "system", 00:25:09.628 "dma_device_type": 1 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.628 "dma_device_type": 2 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "system", 00:25:09.628 "dma_device_type": 1 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.628 "dma_device_type": 2 00:25:09.628 } 00:25:09.628 ], 00:25:09.628 "driver_specific": { 00:25:09.628 "raid": { 00:25:09.628 "uuid": "6900225c-bbb0-49fe-92cd-3624bd16c388", 00:25:09.628 "strip_size_kb": 64, 00:25:09.628 "state": "online", 00:25:09.628 "raid_level": "concat", 00:25:09.628 "superblock": true, 00:25:09.628 "num_base_bdevs": 4, 00:25:09.628 "num_base_bdevs_discovered": 4, 00:25:09.628 "num_base_bdevs_operational": 4, 00:25:09.628 "base_bdevs_list": [ 00:25:09.628 { 00:25:09.628 "name": "NewBaseBdev", 00:25:09.628 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:09.628 "is_configured": true, 00:25:09.628 "data_offset": 2048, 00:25:09.628 "data_size": 63488 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "name": "BaseBdev2", 00:25:09.628 
"uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:09.628 "is_configured": true, 00:25:09.628 "data_offset": 2048, 00:25:09.628 "data_size": 63488 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "name": "BaseBdev3", 00:25:09.628 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:09.628 "is_configured": true, 00:25:09.628 "data_offset": 2048, 00:25:09.628 "data_size": 63488 00:25:09.628 }, 00:25:09.628 { 00:25:09.628 "name": "BaseBdev4", 00:25:09.628 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:09.628 "is_configured": true, 00:25:09.628 "data_offset": 2048, 00:25:09.628 "data_size": 63488 00:25:09.628 } 00:25:09.628 ] 00:25:09.628 } 00:25:09.628 } 00:25:09.628 }' 00:25:09.628 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:09.628 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:25:09.628 BaseBdev2 00:25:09.628 BaseBdev3 00:25:09.628 BaseBdev4' 00:25:09.628 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:09.628 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:25:09.628 17:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:09.887 "name": "NewBaseBdev", 00:25:09.887 "aliases": [ 00:25:09.887 "dd3ac561-4308-4519-911a-c852eed2b72f" 00:25:09.887 ], 00:25:09.887 "product_name": "Malloc disk", 00:25:09.887 "block_size": 512, 00:25:09.887 "num_blocks": 65536, 00:25:09.887 "uuid": "dd3ac561-4308-4519-911a-c852eed2b72f", 00:25:09.887 "assigned_rate_limits": { 00:25:09.887 "rw_ios_per_sec": 0, 00:25:09.887 "rw_mbytes_per_sec": 0, 
00:25:09.887 "r_mbytes_per_sec": 0, 00:25:09.887 "w_mbytes_per_sec": 0 00:25:09.887 }, 00:25:09.887 "claimed": true, 00:25:09.887 "claim_type": "exclusive_write", 00:25:09.887 "zoned": false, 00:25:09.887 "supported_io_types": { 00:25:09.887 "read": true, 00:25:09.887 "write": true, 00:25:09.887 "unmap": true, 00:25:09.887 "flush": true, 00:25:09.887 "reset": true, 00:25:09.887 "nvme_admin": false, 00:25:09.887 "nvme_io": false, 00:25:09.887 "nvme_io_md": false, 00:25:09.887 "write_zeroes": true, 00:25:09.887 "zcopy": true, 00:25:09.887 "get_zone_info": false, 00:25:09.887 "zone_management": false, 00:25:09.887 "zone_append": false, 00:25:09.887 "compare": false, 00:25:09.887 "compare_and_write": false, 00:25:09.887 "abort": true, 00:25:09.887 "seek_hole": false, 00:25:09.887 "seek_data": false, 00:25:09.887 "copy": true, 00:25:09.887 "nvme_iov_md": false 00:25:09.887 }, 00:25:09.887 "memory_domains": [ 00:25:09.887 { 00:25:09.887 "dma_device_id": "system", 00:25:09.887 "dma_device_type": 1 00:25:09.887 }, 00:25:09.887 { 00:25:09.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.887 "dma_device_type": 2 00:25:09.887 } 00:25:09.887 ], 00:25:09.887 "driver_specific": {} 00:25:09.887 }' 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:09.887 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.146 17:19:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:10.146 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:10.405 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:10.405 "name": "BaseBdev2", 00:25:10.405 "aliases": [ 00:25:10.405 "6ff61f26-f7a6-48b4-8e32-7363737e2f4c" 00:25:10.405 ], 00:25:10.405 "product_name": "Malloc disk", 00:25:10.405 "block_size": 512, 00:25:10.405 "num_blocks": 65536, 00:25:10.405 "uuid": "6ff61f26-f7a6-48b4-8e32-7363737e2f4c", 00:25:10.405 "assigned_rate_limits": { 00:25:10.405 "rw_ios_per_sec": 0, 00:25:10.405 "rw_mbytes_per_sec": 0, 00:25:10.405 "r_mbytes_per_sec": 0, 00:25:10.405 "w_mbytes_per_sec": 0 00:25:10.405 }, 00:25:10.405 "claimed": true, 00:25:10.405 "claim_type": "exclusive_write", 00:25:10.405 "zoned": false, 00:25:10.405 "supported_io_types": { 00:25:10.405 "read": true, 00:25:10.405 "write": true, 00:25:10.405 "unmap": true, 00:25:10.405 "flush": true, 00:25:10.405 "reset": true, 00:25:10.405 "nvme_admin": false, 00:25:10.405 "nvme_io": false, 00:25:10.405 "nvme_io_md": false, 00:25:10.405 "write_zeroes": true, 00:25:10.405 "zcopy": true, 00:25:10.405 
"get_zone_info": false, 00:25:10.405 "zone_management": false, 00:25:10.405 "zone_append": false, 00:25:10.405 "compare": false, 00:25:10.405 "compare_and_write": false, 00:25:10.405 "abort": true, 00:25:10.405 "seek_hole": false, 00:25:10.405 "seek_data": false, 00:25:10.405 "copy": true, 00:25:10.405 "nvme_iov_md": false 00:25:10.405 }, 00:25:10.405 "memory_domains": [ 00:25:10.405 { 00:25:10.405 "dma_device_id": "system", 00:25:10.405 "dma_device_type": 1 00:25:10.405 }, 00:25:10.405 { 00:25:10.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:10.405 "dma_device_type": 2 00:25:10.405 } 00:25:10.405 ], 00:25:10.405 "driver_specific": {} 00:25:10.405 }' 00:25:10.405 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.405 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.405 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:10.405 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.663 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:10.663 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:10.663 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.663 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:10.663 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:10.664 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.664 17:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:10.664 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:10.664 17:19:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:10.664 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:25:10.664 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:10.922 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:10.922 "name": "BaseBdev3", 00:25:10.922 "aliases": [ 00:25:10.922 "317f6897-abfe-4d93-afc9-9cd47bf5607e" 00:25:10.922 ], 00:25:10.922 "product_name": "Malloc disk", 00:25:10.922 "block_size": 512, 00:25:10.922 "num_blocks": 65536, 00:25:10.922 "uuid": "317f6897-abfe-4d93-afc9-9cd47bf5607e", 00:25:10.922 "assigned_rate_limits": { 00:25:10.922 "rw_ios_per_sec": 0, 00:25:10.922 "rw_mbytes_per_sec": 0, 00:25:10.922 "r_mbytes_per_sec": 0, 00:25:10.922 "w_mbytes_per_sec": 0 00:25:10.922 }, 00:25:10.922 "claimed": true, 00:25:10.922 "claim_type": "exclusive_write", 00:25:10.922 "zoned": false, 00:25:10.922 "supported_io_types": { 00:25:10.922 "read": true, 00:25:10.922 "write": true, 00:25:10.922 "unmap": true, 00:25:10.922 "flush": true, 00:25:10.922 "reset": true, 00:25:10.922 "nvme_admin": false, 00:25:10.922 "nvme_io": false, 00:25:10.922 "nvme_io_md": false, 00:25:10.922 "write_zeroes": true, 00:25:10.922 "zcopy": true, 00:25:10.922 "get_zone_info": false, 00:25:10.922 "zone_management": false, 00:25:10.922 "zone_append": false, 00:25:10.922 "compare": false, 00:25:10.922 "compare_and_write": false, 00:25:10.922 "abort": true, 00:25:10.922 "seek_hole": false, 00:25:10.922 "seek_data": false, 00:25:10.922 "copy": true, 00:25:10.922 "nvme_iov_md": false 00:25:10.922 }, 00:25:10.922 "memory_domains": [ 00:25:10.922 { 00:25:10.922 "dma_device_id": "system", 00:25:10.922 "dma_device_type": 1 00:25:10.922 }, 00:25:10.922 { 00:25:10.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:10.922 
"dma_device_type": 2 00:25:10.922 } 00:25:10.922 ], 00:25:10.922 "driver_specific": {} 00:25:10.922 }' 00:25:10.922 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:10.922 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:11.180 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:11.438 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:25:11.438 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:11.438 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:11.438 "name": "BaseBdev4", 00:25:11.438 "aliases": [ 00:25:11.438 
"d4108487-0595-4bb5-b1b8-894f24e421a1" 00:25:11.438 ], 00:25:11.438 "product_name": "Malloc disk", 00:25:11.438 "block_size": 512, 00:25:11.438 "num_blocks": 65536, 00:25:11.438 "uuid": "d4108487-0595-4bb5-b1b8-894f24e421a1", 00:25:11.438 "assigned_rate_limits": { 00:25:11.438 "rw_ios_per_sec": 0, 00:25:11.438 "rw_mbytes_per_sec": 0, 00:25:11.438 "r_mbytes_per_sec": 0, 00:25:11.438 "w_mbytes_per_sec": 0 00:25:11.438 }, 00:25:11.438 "claimed": true, 00:25:11.438 "claim_type": "exclusive_write", 00:25:11.438 "zoned": false, 00:25:11.438 "supported_io_types": { 00:25:11.438 "read": true, 00:25:11.438 "write": true, 00:25:11.438 "unmap": true, 00:25:11.438 "flush": true, 00:25:11.438 "reset": true, 00:25:11.438 "nvme_admin": false, 00:25:11.438 "nvme_io": false, 00:25:11.438 "nvme_io_md": false, 00:25:11.438 "write_zeroes": true, 00:25:11.438 "zcopy": true, 00:25:11.438 "get_zone_info": false, 00:25:11.438 "zone_management": false, 00:25:11.438 "zone_append": false, 00:25:11.438 "compare": false, 00:25:11.438 "compare_and_write": false, 00:25:11.438 "abort": true, 00:25:11.438 "seek_hole": false, 00:25:11.438 "seek_data": false, 00:25:11.438 "copy": true, 00:25:11.438 "nvme_iov_md": false 00:25:11.438 }, 00:25:11.438 "memory_domains": [ 00:25:11.438 { 00:25:11.438 "dma_device_id": "system", 00:25:11.438 "dma_device_type": 1 00:25:11.438 }, 00:25:11.438 { 00:25:11.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:11.438 "dma_device_type": 2 00:25:11.438 } 00:25:11.438 ], 00:25:11.438 "driver_specific": {} 00:25:11.438 }' 00:25:11.438 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:11.697 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:11.697 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:11.697 17:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:11.697 17:19:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:11.697 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:11.697 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:11.955 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:11.955 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:11.955 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:11.955 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:11.955 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:11.955 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:12.214 [2024-07-23 17:19:07.545330] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:12.214 [2024-07-23 17:19:07.545359] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:12.214 [2024-07-23 17:19:07.545416] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:12.214 [2024-07-23 17:19:07.545475] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:12.214 [2024-07-23 17:19:07.545487] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd938f0 name Existed_Raid, state offline 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 4193087 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 4193087 ']' 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 4193087 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4193087 00:25:12.214 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:12.215 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:12.215 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4193087' 00:25:12.215 killing process with pid 4193087 00:25:12.215 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 4193087 00:25:12.215 [2024-07-23 17:19:07.630361] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:12.215 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 4193087 00:25:12.474 [2024-07-23 17:19:07.667186] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:12.474 17:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:25:12.474 00:25:12.474 real 0m33.710s 00:25:12.474 user 1m1.992s 00:25:12.474 sys 0m6.021s 00:25:12.474 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:12.474 17:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:12.474 ************************************ 00:25:12.474 END TEST raid_state_function_test_sb 00:25:12.474 ************************************ 00:25:12.733 17:19:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:12.733 17:19:07 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:25:12.733 17:19:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:25:12.733 17:19:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:12.733 17:19:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:12.733 ************************************ 00:25:12.733 START TEST raid_superblock_test 00:25:12.733 ************************************ 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=4512 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 4512 /var/tmp/spdk-raid.sock 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 4512 ']' 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:12.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:12.733 17:19:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:12.733 [2024-07-23 17:19:08.052216] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:25:12.733 [2024-07-23 17:19:08.052351] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4512 ] 00:25:12.992 [2024-07-23 17:19:08.246949] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:12.992 [2024-07-23 17:19:08.300277] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:12.992 [2024-07-23 17:19:08.368445] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:12.992 [2024-07-23 17:19:08.368482] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:13.559 17:19:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:25:13.820 malloc1 00:25:13.820 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:14.079 [2024-07-23 17:19:09.412254] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:14.079 [2024-07-23 17:19:09.412302] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.079 [2024-07-23 17:19:09.412325] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d54070 00:25:14.079 [2024-07-23 17:19:09.412338] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.079 [2024-07-23 17:19:09.413982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.079 [2024-07-23 17:19:09.414011] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:14.079 pt1 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:14.079 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:14.079 17:19:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:25:14.337 malloc2 00:25:14.337 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:14.595 [2024-07-23 17:19:09.906286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:14.595 [2024-07-23 17:19:09.906332] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.595 [2024-07-23 17:19:09.906350] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3a920 00:25:14.595 [2024-07-23 17:19:09.906363] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.595 [2024-07-23 17:19:09.908083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.595 [2024-07-23 17:19:09.908112] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:14.595 pt2 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:14.595 17:19:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:14.595 17:19:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:25:14.853 malloc3 00:25:14.853 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:25:15.111 [2024-07-23 17:19:10.401304] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:25:15.111 [2024-07-23 17:19:10.401352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.111 [2024-07-23 17:19:10.401370] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d4c3e0 00:25:15.111 [2024-07-23 17:19:10.401383] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.111 [2024-07-23 17:19:10.402986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.111 [2024-07-23 17:19:10.403014] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:25:15.111 pt3 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:15.111 
17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:15.111 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:25:15.369 malloc4 00:25:15.369 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:25:15.627 [2024-07-23 17:19:10.896387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:25:15.627 [2024-07-23 17:19:10.896432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.627 [2024-07-23 17:19:10.896449] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d4e870 00:25:15.627 [2024-07-23 17:19:10.896461] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.627 [2024-07-23 17:19:10.898039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.627 [2024-07-23 17:19:10.898066] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:25:15.627 pt4 00:25:15.627 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:15.627 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:15.627 17:19:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:25:15.886 [2024-07-23 17:19:11.141059] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:25:15.886 [2024-07-23 17:19:11.142375] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:15.886 [2024-07-23 17:19:11.142429] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:25:15.886 [2024-07-23 17:19:11.142472] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:25:15.886 [2024-07-23 17:19:11.142642] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d4fe80 00:25:15.886 [2024-07-23 17:19:11.142653] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:25:15.886 [2024-07-23 17:19:11.142855] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d4c670 00:25:15.886 [2024-07-23 17:19:11.143007] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d4fe80 00:25:15.886 [2024-07-23 17:19:11.143018] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d4fe80 00:25:15.886 [2024-07-23 17:19:11.143115] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.886 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.145 17:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.145 "name": "raid_bdev1", 00:25:16.145 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:16.145 "strip_size_kb": 64, 00:25:16.145 "state": "online", 00:25:16.145 "raid_level": "concat", 00:25:16.145 "superblock": true, 00:25:16.145 "num_base_bdevs": 4, 00:25:16.145 "num_base_bdevs_discovered": 4, 00:25:16.145 "num_base_bdevs_operational": 4, 00:25:16.145 "base_bdevs_list": [ 00:25:16.145 { 00:25:16.145 "name": "pt1", 00:25:16.145 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:16.145 "is_configured": true, 00:25:16.145 "data_offset": 2048, 00:25:16.145 "data_size": 63488 00:25:16.145 }, 00:25:16.145 { 00:25:16.145 "name": "pt2", 00:25:16.145 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:16.145 "is_configured": true, 00:25:16.145 "data_offset": 2048, 00:25:16.145 "data_size": 63488 00:25:16.145 }, 00:25:16.145 { 00:25:16.145 "name": "pt3", 00:25:16.145 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:16.145 "is_configured": true, 00:25:16.145 "data_offset": 2048, 00:25:16.145 "data_size": 63488 00:25:16.145 }, 00:25:16.145 { 00:25:16.145 "name": "pt4", 00:25:16.145 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:16.145 "is_configured": true, 00:25:16.145 "data_offset": 2048, 00:25:16.145 "data_size": 63488 00:25:16.145 } 00:25:16.145 ] 00:25:16.145 }' 00:25:16.145 17:19:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.145 17:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:17.081 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:17.340 [2024-07-23 17:19:12.573129] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:17.340 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:17.340 "name": "raid_bdev1", 00:25:17.340 "aliases": [ 00:25:17.340 "d86fed13-1cf0-4239-8b78-ce9dfaa88837" 00:25:17.340 ], 00:25:17.340 "product_name": "Raid Volume", 00:25:17.340 "block_size": 512, 00:25:17.340 "num_blocks": 253952, 00:25:17.340 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:17.340 "assigned_rate_limits": { 00:25:17.340 "rw_ios_per_sec": 0, 00:25:17.340 "rw_mbytes_per_sec": 0, 00:25:17.340 "r_mbytes_per_sec": 0, 00:25:17.340 "w_mbytes_per_sec": 0 00:25:17.340 }, 00:25:17.340 "claimed": false, 00:25:17.340 "zoned": false, 00:25:17.340 "supported_io_types": { 00:25:17.340 "read": true, 00:25:17.340 "write": true, 00:25:17.340 
"unmap": true, 00:25:17.340 "flush": true, 00:25:17.340 "reset": true, 00:25:17.340 "nvme_admin": false, 00:25:17.340 "nvme_io": false, 00:25:17.340 "nvme_io_md": false, 00:25:17.340 "write_zeroes": true, 00:25:17.340 "zcopy": false, 00:25:17.340 "get_zone_info": false, 00:25:17.340 "zone_management": false, 00:25:17.340 "zone_append": false, 00:25:17.340 "compare": false, 00:25:17.340 "compare_and_write": false, 00:25:17.340 "abort": false, 00:25:17.340 "seek_hole": false, 00:25:17.340 "seek_data": false, 00:25:17.340 "copy": false, 00:25:17.340 "nvme_iov_md": false 00:25:17.340 }, 00:25:17.340 "memory_domains": [ 00:25:17.340 { 00:25:17.340 "dma_device_id": "system", 00:25:17.340 "dma_device_type": 1 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.340 "dma_device_type": 2 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "system", 00:25:17.340 "dma_device_type": 1 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.340 "dma_device_type": 2 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "system", 00:25:17.340 "dma_device_type": 1 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.340 "dma_device_type": 2 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "system", 00:25:17.340 "dma_device_type": 1 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.340 "dma_device_type": 2 00:25:17.340 } 00:25:17.340 ], 00:25:17.340 "driver_specific": { 00:25:17.340 "raid": { 00:25:17.340 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:17.340 "strip_size_kb": 64, 00:25:17.340 "state": "online", 00:25:17.340 "raid_level": "concat", 00:25:17.340 "superblock": true, 00:25:17.340 "num_base_bdevs": 4, 00:25:17.340 "num_base_bdevs_discovered": 4, 00:25:17.340 "num_base_bdevs_operational": 4, 00:25:17.340 "base_bdevs_list": [ 00:25:17.340 { 00:25:17.340 "name": "pt1", 
00:25:17.340 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:17.340 "is_configured": true, 00:25:17.340 "data_offset": 2048, 00:25:17.340 "data_size": 63488 00:25:17.340 }, 00:25:17.340 { 00:25:17.340 "name": "pt2", 00:25:17.341 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:17.341 "is_configured": true, 00:25:17.341 "data_offset": 2048, 00:25:17.341 "data_size": 63488 00:25:17.341 }, 00:25:17.341 { 00:25:17.341 "name": "pt3", 00:25:17.341 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:17.341 "is_configured": true, 00:25:17.341 "data_offset": 2048, 00:25:17.341 "data_size": 63488 00:25:17.341 }, 00:25:17.341 { 00:25:17.341 "name": "pt4", 00:25:17.341 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:17.341 "is_configured": true, 00:25:17.341 "data_offset": 2048, 00:25:17.341 "data_size": 63488 00:25:17.341 } 00:25:17.341 ] 00:25:17.341 } 00:25:17.341 } 00:25:17.341 }' 00:25:17.341 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:17.341 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:17.341 pt2 00:25:17.341 pt3 00:25:17.341 pt4' 00:25:17.341 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:17.341 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:17.341 17:19:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:17.908 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:17.908 "name": "pt1", 00:25:17.908 "aliases": [ 00:25:17.908 "00000000-0000-0000-0000-000000000001" 00:25:17.908 ], 00:25:17.908 "product_name": "passthru", 00:25:17.908 "block_size": 512, 00:25:17.908 "num_blocks": 65536, 00:25:17.908 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:25:17.908 "assigned_rate_limits": { 00:25:17.908 "rw_ios_per_sec": 0, 00:25:17.908 "rw_mbytes_per_sec": 0, 00:25:17.908 "r_mbytes_per_sec": 0, 00:25:17.908 "w_mbytes_per_sec": 0 00:25:17.908 }, 00:25:17.908 "claimed": true, 00:25:17.908 "claim_type": "exclusive_write", 00:25:17.908 "zoned": false, 00:25:17.908 "supported_io_types": { 00:25:17.908 "read": true, 00:25:17.908 "write": true, 00:25:17.908 "unmap": true, 00:25:17.908 "flush": true, 00:25:17.908 "reset": true, 00:25:17.908 "nvme_admin": false, 00:25:17.909 "nvme_io": false, 00:25:17.909 "nvme_io_md": false, 00:25:17.909 "write_zeroes": true, 00:25:17.909 "zcopy": true, 00:25:17.909 "get_zone_info": false, 00:25:17.909 "zone_management": false, 00:25:17.909 "zone_append": false, 00:25:17.909 "compare": false, 00:25:17.909 "compare_and_write": false, 00:25:17.909 "abort": true, 00:25:17.909 "seek_hole": false, 00:25:17.909 "seek_data": false, 00:25:17.909 "copy": true, 00:25:17.909 "nvme_iov_md": false 00:25:17.909 }, 00:25:17.909 "memory_domains": [ 00:25:17.909 { 00:25:17.909 "dma_device_id": "system", 00:25:17.909 "dma_device_type": 1 00:25:17.909 }, 00:25:17.909 { 00:25:17.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:17.909 "dma_device_type": 2 00:25:17.909 } 00:25:17.909 ], 00:25:17.909 "driver_specific": { 00:25:17.909 "passthru": { 00:25:17.909 "name": "pt1", 00:25:17.909 "base_bdev_name": "malloc1" 00:25:17.909 } 00:25:17.909 } 00:25:17.909 }' 00:25:17.909 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:17.909 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:17.909 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:17.909 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:17.909 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.170 17:19:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:18.170 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.170 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.170 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:18.170 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.170 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.430 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:18.430 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:18.430 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:18.430 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:18.688 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:18.688 "name": "pt2", 00:25:18.688 "aliases": [ 00:25:18.688 "00000000-0000-0000-0000-000000000002" 00:25:18.688 ], 00:25:18.688 "product_name": "passthru", 00:25:18.688 "block_size": 512, 00:25:18.688 "num_blocks": 65536, 00:25:18.688 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:18.688 "assigned_rate_limits": { 00:25:18.688 "rw_ios_per_sec": 0, 00:25:18.688 "rw_mbytes_per_sec": 0, 00:25:18.688 "r_mbytes_per_sec": 0, 00:25:18.688 "w_mbytes_per_sec": 0 00:25:18.688 }, 00:25:18.688 "claimed": true, 00:25:18.688 "claim_type": "exclusive_write", 00:25:18.688 "zoned": false, 00:25:18.688 "supported_io_types": { 00:25:18.688 "read": true, 00:25:18.688 "write": true, 00:25:18.688 "unmap": true, 00:25:18.688 "flush": true, 00:25:18.688 "reset": true, 00:25:18.688 "nvme_admin": false, 00:25:18.688 
"nvme_io": false, 00:25:18.688 "nvme_io_md": false, 00:25:18.688 "write_zeroes": true, 00:25:18.688 "zcopy": true, 00:25:18.688 "get_zone_info": false, 00:25:18.688 "zone_management": false, 00:25:18.688 "zone_append": false, 00:25:18.688 "compare": false, 00:25:18.688 "compare_and_write": false, 00:25:18.688 "abort": true, 00:25:18.688 "seek_hole": false, 00:25:18.688 "seek_data": false, 00:25:18.688 "copy": true, 00:25:18.688 "nvme_iov_md": false 00:25:18.688 }, 00:25:18.688 "memory_domains": [ 00:25:18.688 { 00:25:18.688 "dma_device_id": "system", 00:25:18.688 "dma_device_type": 1 00:25:18.688 }, 00:25:18.688 { 00:25:18.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:18.688 "dma_device_type": 2 00:25:18.688 } 00:25:18.688 ], 00:25:18.688 "driver_specific": { 00:25:18.688 "passthru": { 00:25:18.688 "name": "pt2", 00:25:18.688 "base_bdev_name": "malloc2" 00:25:18.688 } 00:25:18.688 } 00:25:18.688 }' 00:25:18.688 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.688 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.688 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:18.688 17:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.688 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:25:18.947 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:19.205 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:19.205 "name": "pt3", 00:25:19.205 "aliases": [ 00:25:19.205 "00000000-0000-0000-0000-000000000003" 00:25:19.205 ], 00:25:19.205 "product_name": "passthru", 00:25:19.205 "block_size": 512, 00:25:19.205 "num_blocks": 65536, 00:25:19.205 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:19.205 "assigned_rate_limits": { 00:25:19.205 "rw_ios_per_sec": 0, 00:25:19.205 "rw_mbytes_per_sec": 0, 00:25:19.205 "r_mbytes_per_sec": 0, 00:25:19.205 "w_mbytes_per_sec": 0 00:25:19.205 }, 00:25:19.205 "claimed": true, 00:25:19.205 "claim_type": "exclusive_write", 00:25:19.205 "zoned": false, 00:25:19.205 "supported_io_types": { 00:25:19.205 "read": true, 00:25:19.205 "write": true, 00:25:19.205 "unmap": true, 00:25:19.205 "flush": true, 00:25:19.205 "reset": true, 00:25:19.205 "nvme_admin": false, 00:25:19.205 "nvme_io": false, 00:25:19.205 "nvme_io_md": false, 00:25:19.205 "write_zeroes": true, 00:25:19.205 "zcopy": true, 00:25:19.205 "get_zone_info": false, 00:25:19.205 "zone_management": false, 00:25:19.205 "zone_append": false, 00:25:19.205 "compare": false, 00:25:19.205 "compare_and_write": false, 00:25:19.205 "abort": true, 00:25:19.205 "seek_hole": false, 00:25:19.205 "seek_data": false, 00:25:19.205 "copy": true, 00:25:19.205 "nvme_iov_md": false 00:25:19.205 }, 00:25:19.205 "memory_domains": [ 00:25:19.205 { 00:25:19.205 "dma_device_id": "system", 00:25:19.205 
"dma_device_type": 1 00:25:19.205 }, 00:25:19.205 { 00:25:19.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:19.205 "dma_device_type": 2 00:25:19.205 } 00:25:19.205 ], 00:25:19.205 "driver_specific": { 00:25:19.205 "passthru": { 00:25:19.205 "name": "pt3", 00:25:19.205 "base_bdev_name": "malloc3" 00:25:19.205 } 00:25:19.205 } 00:25:19.205 }' 00:25:19.205 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:19.464 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:19.464 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:19.464 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:19.464 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:19.464 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:19.464 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:19.722 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:19.722 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:19.722 17:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.722 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.722 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:19.722 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:19.981 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:25:19.981 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:20.240 17:19:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:20.240 "name": "pt4", 00:25:20.240 "aliases": [ 00:25:20.240 "00000000-0000-0000-0000-000000000004" 00:25:20.240 ], 00:25:20.240 "product_name": "passthru", 00:25:20.240 "block_size": 512, 00:25:20.240 "num_blocks": 65536, 00:25:20.240 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:20.240 "assigned_rate_limits": { 00:25:20.240 "rw_ios_per_sec": 0, 00:25:20.240 "rw_mbytes_per_sec": 0, 00:25:20.240 "r_mbytes_per_sec": 0, 00:25:20.240 "w_mbytes_per_sec": 0 00:25:20.240 }, 00:25:20.240 "claimed": true, 00:25:20.240 "claim_type": "exclusive_write", 00:25:20.240 "zoned": false, 00:25:20.240 "supported_io_types": { 00:25:20.240 "read": true, 00:25:20.240 "write": true, 00:25:20.240 "unmap": true, 00:25:20.240 "flush": true, 00:25:20.240 "reset": true, 00:25:20.240 "nvme_admin": false, 00:25:20.240 "nvme_io": false, 00:25:20.240 "nvme_io_md": false, 00:25:20.240 "write_zeroes": true, 00:25:20.240 "zcopy": true, 00:25:20.240 "get_zone_info": false, 00:25:20.240 "zone_management": false, 00:25:20.240 "zone_append": false, 00:25:20.240 "compare": false, 00:25:20.240 "compare_and_write": false, 00:25:20.240 "abort": true, 00:25:20.240 "seek_hole": false, 00:25:20.240 "seek_data": false, 00:25:20.240 "copy": true, 00:25:20.240 "nvme_iov_md": false 00:25:20.240 }, 00:25:20.240 "memory_domains": [ 00:25:20.240 { 00:25:20.240 "dma_device_id": "system", 00:25:20.240 "dma_device_type": 1 00:25:20.240 }, 00:25:20.240 { 00:25:20.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:20.240 "dma_device_type": 2 00:25:20.240 } 00:25:20.240 ], 00:25:20.240 "driver_specific": { 00:25:20.240 "passthru": { 00:25:20.240 "name": "pt4", 00:25:20.240 "base_bdev_name": "malloc4" 00:25:20.240 } 00:25:20.240 } 00:25:20.240 }' 00:25:20.498 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:20.498 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:20.498 17:19:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:20.498 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:20.498 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:20.756 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:20.756 17:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:20.756 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:20.756 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:20.756 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:20.756 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:21.015 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:21.015 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:21.015 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:21.273 [2024-07-23 17:19:16.447396] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:21.273 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d86fed13-1cf0-4239-8b78-ce9dfaa88837 00:25:21.273 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d86fed13-1cf0-4239-8b78-ce9dfaa88837 ']' 00:25:21.273 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:21.842 [2024-07-23 17:19:16.956446] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:21.842 
[2024-07-23 17:19:16.956473] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:21.842 [2024-07-23 17:19:16.956523] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:21.842 [2024-07-23 17:19:16.956586] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:21.842 [2024-07-23 17:19:16.956597] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d4fe80 name raid_bdev1, state offline 00:25:21.842 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.842 17:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:22.164 17:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:22.164 17:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:22.164 17:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:22.164 17:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:22.732 17:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:22.732 17:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:22.990 17:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:22.990 17:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:25:23.558 17:19:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:23.558 17:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:25:24.125 17:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:24.125 17:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:24.693 17:19:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:25:24.693 [2024-07-23 17:19:20.096573] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:24.693 [2024-07-23 17:19:20.097967] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:24.693 [2024-07-23 17:19:20.098011] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:25:24.693 [2024-07-23 17:19:20.098045] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:25:24.693 [2024-07-23 17:19:20.098091] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:24.693 [2024-07-23 17:19:20.098131] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:24.693 [2024-07-23 17:19:20.098153] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:25:24.693 [2024-07-23 17:19:20.098175] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:25:24.693 
[2024-07-23 17:19:20.098193] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:24.693 [2024-07-23 17:19:20.098202] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c3a090 name raid_bdev1, state configuring 00:25:24.693 request: 00:25:24.693 { 00:25:24.693 "name": "raid_bdev1", 00:25:24.693 "raid_level": "concat", 00:25:24.693 "base_bdevs": [ 00:25:24.693 "malloc1", 00:25:24.693 "malloc2", 00:25:24.693 "malloc3", 00:25:24.693 "malloc4" 00:25:24.693 ], 00:25:24.693 "strip_size_kb": 64, 00:25:24.693 "superblock": false, 00:25:24.693 "method": "bdev_raid_create", 00:25:24.693 "req_id": 1 00:25:24.693 } 00:25:24.693 Got JSON-RPC error response 00:25:24.693 response: 00:25:24.693 { 00:25:24.693 "code": -17, 00:25:24.693 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:24.693 } 00:25:24.951 17:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:25:24.951 17:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:24.951 17:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:24.951 17:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:24.951 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:24.951 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:25:25.519 [2024-07-23 17:19:20.906633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:25.519 [2024-07-23 17:19:20.906672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:25.519 [2024-07-23 17:19:20.906690] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d543b0 00:25:25.519 [2024-07-23 17:19:20.906702] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:25.519 [2024-07-23 17:19:20.908286] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:25.519 [2024-07-23 17:19:20.908315] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:25.519 [2024-07-23 17:19:20.908376] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:25.519 [2024-07-23 17:19:20.908408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:25.519 pt1 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:25.519 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.520 17:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.087 17:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.087 "name": "raid_bdev1", 00:25:26.087 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:26.087 "strip_size_kb": 64, 00:25:26.087 "state": "configuring", 00:25:26.087 "raid_level": "concat", 00:25:26.087 "superblock": true, 00:25:26.087 "num_base_bdevs": 4, 00:25:26.087 "num_base_bdevs_discovered": 1, 00:25:26.087 "num_base_bdevs_operational": 4, 00:25:26.087 "base_bdevs_list": [ 00:25:26.087 { 00:25:26.087 "name": "pt1", 00:25:26.087 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:26.087 "is_configured": true, 00:25:26.087 "data_offset": 2048, 00:25:26.087 "data_size": 63488 00:25:26.087 }, 00:25:26.087 { 00:25:26.087 "name": null, 00:25:26.087 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:26.087 "is_configured": false, 00:25:26.087 "data_offset": 2048, 00:25:26.087 "data_size": 63488 00:25:26.087 }, 00:25:26.087 { 00:25:26.087 "name": null, 00:25:26.087 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:26.087 "is_configured": false, 00:25:26.087 "data_offset": 2048, 00:25:26.087 "data_size": 63488 00:25:26.087 }, 00:25:26.087 { 00:25:26.087 "name": null, 00:25:26.087 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:26.087 "is_configured": false, 00:25:26.087 "data_offset": 2048, 00:25:26.087 "data_size": 63488 00:25:26.087 } 00:25:26.087 ] 00:25:26.087 }' 00:25:26.087 17:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.087 17:19:21 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:27.022 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:25:27.022 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:27.280 [2024-07-23 17:19:22.534977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:27.280 [2024-07-23 17:19:22.535022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:27.280 [2024-07-23 17:19:22.535041] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ba32d0 00:25:27.280 [2024-07-23 17:19:22.535053] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:27.280 [2024-07-23 17:19:22.535374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:27.280 [2024-07-23 17:19:22.535390] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:27.280 [2024-07-23 17:19:22.535449] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:27.280 [2024-07-23 17:19:22.535468] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:27.280 pt2 00:25:27.280 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:27.539 [2024-07-23 17:19:22.783646] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.539 17:19:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.539 17:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.107 17:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:28.107 "name": "raid_bdev1", 00:25:28.107 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:28.107 "strip_size_kb": 64, 00:25:28.107 "state": "configuring", 00:25:28.107 "raid_level": "concat", 00:25:28.107 "superblock": true, 00:25:28.107 "num_base_bdevs": 4, 00:25:28.107 "num_base_bdevs_discovered": 1, 00:25:28.107 "num_base_bdevs_operational": 4, 00:25:28.107 "base_bdevs_list": [ 00:25:28.107 { 00:25:28.107 "name": "pt1", 00:25:28.107 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:28.107 "is_configured": true, 00:25:28.107 "data_offset": 2048, 00:25:28.107 "data_size": 63488 00:25:28.107 }, 00:25:28.107 { 00:25:28.107 "name": null, 00:25:28.107 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:28.107 
"is_configured": false, 00:25:28.107 "data_offset": 2048, 00:25:28.107 "data_size": 63488 00:25:28.107 }, 00:25:28.107 { 00:25:28.107 "name": null, 00:25:28.107 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:28.107 "is_configured": false, 00:25:28.107 "data_offset": 2048, 00:25:28.107 "data_size": 63488 00:25:28.107 }, 00:25:28.107 { 00:25:28.107 "name": null, 00:25:28.107 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:28.108 "is_configured": false, 00:25:28.108 "data_offset": 2048, 00:25:28.108 "data_size": 63488 00:25:28.108 } 00:25:28.108 ] 00:25:28.108 }' 00:25:28.108 17:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:28.108 17:19:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:28.677 17:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:25:28.677 17:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:28.677 17:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:29.245 [2024-07-23 17:19:24.476174] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:29.245 [2024-07-23 17:19:24.476240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.245 [2024-07-23 17:19:24.476265] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d551a0 00:25:29.245 [2024-07-23 17:19:24.476278] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.245 [2024-07-23 17:19:24.476708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.245 [2024-07-23 17:19:24.476730] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:29.245 [2024-07-23 17:19:24.476819] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:29.245 [2024-07-23 17:19:24.476844] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:29.245 pt2 00:25:29.245 17:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:29.245 17:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:29.245 17:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:25:29.503 [2024-07-23 17:19:24.736850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:25:29.503 [2024-07-23 17:19:24.736886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.503 [2024-07-23 17:19:24.736911] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ba3890 00:25:29.503 [2024-07-23 17:19:24.736923] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.503 [2024-07-23 17:19:24.737228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.503 [2024-07-23 17:19:24.737246] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:25:29.503 [2024-07-23 17:19:24.737305] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:25:29.503 [2024-07-23 17:19:24.737326] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:25:29.503 pt3 00:25:29.503 17:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:29.503 17:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:29.503 17:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:25:29.762 [2024-07-23 17:19:24.997537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:25:29.762 [2024-07-23 17:19:24.997573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.762 [2024-07-23 17:19:24.997589] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d50f90 00:25:29.762 [2024-07-23 17:19:24.997601] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.762 [2024-07-23 17:19:24.997917] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.762 [2024-07-23 17:19:24.997936] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:25:29.762 [2024-07-23 17:19:24.997989] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:25:29.762 [2024-07-23 17:19:24.998007] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:25:29.762 [2024-07-23 17:19:24.998129] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ba3540 00:25:29.762 [2024-07-23 17:19:24.998140] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:25:29.762 [2024-07-23 17:19:24.998310] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d52170 00:25:29.762 [2024-07-23 17:19:24.998446] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ba3540 00:25:29.762 [2024-07-23 17:19:24.998456] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ba3540 00:25:29.762 [2024-07-23 17:19:24.998555] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.762 pt4 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.762 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.330 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.330 "name": "raid_bdev1", 00:25:30.330 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:30.330 "strip_size_kb": 64, 00:25:30.330 "state": "online", 00:25:30.330 "raid_level": "concat", 00:25:30.330 "superblock": true, 00:25:30.330 "num_base_bdevs": 4, 00:25:30.330 "num_base_bdevs_discovered": 4, 00:25:30.330 "num_base_bdevs_operational": 4, 
00:25:30.330 "base_bdevs_list": [ 00:25:30.330 { 00:25:30.330 "name": "pt1", 00:25:30.330 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:30.330 "is_configured": true, 00:25:30.330 "data_offset": 2048, 00:25:30.330 "data_size": 63488 00:25:30.330 }, 00:25:30.330 { 00:25:30.330 "name": "pt2", 00:25:30.330 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:30.330 "is_configured": true, 00:25:30.330 "data_offset": 2048, 00:25:30.330 "data_size": 63488 00:25:30.330 }, 00:25:30.330 { 00:25:30.330 "name": "pt3", 00:25:30.330 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:30.330 "is_configured": true, 00:25:30.330 "data_offset": 2048, 00:25:30.330 "data_size": 63488 00:25:30.330 }, 00:25:30.330 { 00:25:30.330 "name": "pt4", 00:25:30.330 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:30.330 "is_configured": true, 00:25:30.330 "data_offset": 2048, 00:25:30.330 "data_size": 63488 00:25:30.330 } 00:25:30.330 ] 00:25:30.330 }' 00:25:30.330 17:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.330 17:19:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:25:30.896 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:31.155 [2024-07-23 17:19:26.477791] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:31.155 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:31.155 "name": "raid_bdev1", 00:25:31.155 "aliases": [ 00:25:31.155 "d86fed13-1cf0-4239-8b78-ce9dfaa88837" 00:25:31.155 ], 00:25:31.155 "product_name": "Raid Volume", 00:25:31.155 "block_size": 512, 00:25:31.155 "num_blocks": 253952, 00:25:31.155 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:31.155 "assigned_rate_limits": { 00:25:31.155 "rw_ios_per_sec": 0, 00:25:31.155 "rw_mbytes_per_sec": 0, 00:25:31.155 "r_mbytes_per_sec": 0, 00:25:31.155 "w_mbytes_per_sec": 0 00:25:31.155 }, 00:25:31.155 "claimed": false, 00:25:31.155 "zoned": false, 00:25:31.155 "supported_io_types": { 00:25:31.155 "read": true, 00:25:31.155 "write": true, 00:25:31.155 "unmap": true, 00:25:31.155 "flush": true, 00:25:31.155 "reset": true, 00:25:31.155 "nvme_admin": false, 00:25:31.155 "nvme_io": false, 00:25:31.155 "nvme_io_md": false, 00:25:31.155 "write_zeroes": true, 00:25:31.155 "zcopy": false, 00:25:31.155 "get_zone_info": false, 00:25:31.155 "zone_management": false, 00:25:31.155 "zone_append": false, 00:25:31.155 "compare": false, 00:25:31.155 "compare_and_write": false, 00:25:31.155 "abort": false, 00:25:31.155 "seek_hole": false, 00:25:31.155 "seek_data": false, 00:25:31.155 "copy": false, 00:25:31.155 "nvme_iov_md": false 00:25:31.155 }, 00:25:31.155 "memory_domains": [ 00:25:31.155 { 00:25:31.155 "dma_device_id": "system", 00:25:31.155 "dma_device_type": 1 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:31.155 "dma_device_type": 2 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "system", 00:25:31.155 "dma_device_type": 1 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:25:31.155 "dma_device_type": 2 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "system", 00:25:31.155 "dma_device_type": 1 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:31.155 "dma_device_type": 2 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "system", 00:25:31.155 "dma_device_type": 1 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:31.155 "dma_device_type": 2 00:25:31.155 } 00:25:31.155 ], 00:25:31.155 "driver_specific": { 00:25:31.155 "raid": { 00:25:31.155 "uuid": "d86fed13-1cf0-4239-8b78-ce9dfaa88837", 00:25:31.155 "strip_size_kb": 64, 00:25:31.155 "state": "online", 00:25:31.155 "raid_level": "concat", 00:25:31.155 "superblock": true, 00:25:31.155 "num_base_bdevs": 4, 00:25:31.155 "num_base_bdevs_discovered": 4, 00:25:31.155 "num_base_bdevs_operational": 4, 00:25:31.155 "base_bdevs_list": [ 00:25:31.155 { 00:25:31.155 "name": "pt1", 00:25:31.155 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:31.155 "is_configured": true, 00:25:31.155 "data_offset": 2048, 00:25:31.155 "data_size": 63488 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "name": "pt2", 00:25:31.155 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:31.155 "is_configured": true, 00:25:31.155 "data_offset": 2048, 00:25:31.155 "data_size": 63488 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "name": "pt3", 00:25:31.155 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:31.155 "is_configured": true, 00:25:31.155 "data_offset": 2048, 00:25:31.155 "data_size": 63488 00:25:31.155 }, 00:25:31.155 { 00:25:31.155 "name": "pt4", 00:25:31.155 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:31.155 "is_configured": true, 00:25:31.155 "data_offset": 2048, 00:25:31.155 "data_size": 63488 00:25:31.155 } 00:25:31.155 ] 00:25:31.155 } 00:25:31.155 } 00:25:31.155 }' 00:25:31.156 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:31.414 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:31.414 pt2 00:25:31.414 pt3 00:25:31.414 pt4' 00:25:31.414 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:31.414 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:31.414 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:31.674 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:31.674 "name": "pt1", 00:25:31.674 "aliases": [ 00:25:31.674 "00000000-0000-0000-0000-000000000001" 00:25:31.674 ], 00:25:31.674 "product_name": "passthru", 00:25:31.674 "block_size": 512, 00:25:31.674 "num_blocks": 65536, 00:25:31.674 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:31.674 "assigned_rate_limits": { 00:25:31.674 "rw_ios_per_sec": 0, 00:25:31.674 "rw_mbytes_per_sec": 0, 00:25:31.674 "r_mbytes_per_sec": 0, 00:25:31.674 "w_mbytes_per_sec": 0 00:25:31.674 }, 00:25:31.674 "claimed": true, 00:25:31.674 "claim_type": "exclusive_write", 00:25:31.674 "zoned": false, 00:25:31.674 "supported_io_types": { 00:25:31.674 "read": true, 00:25:31.674 "write": true, 00:25:31.674 "unmap": true, 00:25:31.674 "flush": true, 00:25:31.674 "reset": true, 00:25:31.674 "nvme_admin": false, 00:25:31.674 "nvme_io": false, 00:25:31.674 "nvme_io_md": false, 00:25:31.674 "write_zeroes": true, 00:25:31.674 "zcopy": true, 00:25:31.674 "get_zone_info": false, 00:25:31.674 "zone_management": false, 00:25:31.674 "zone_append": false, 00:25:31.674 "compare": false, 00:25:31.674 "compare_and_write": false, 00:25:31.674 "abort": true, 00:25:31.674 "seek_hole": false, 00:25:31.674 "seek_data": false, 00:25:31.674 "copy": true, 00:25:31.674 "nvme_iov_md": 
false 00:25:31.674 }, 00:25:31.674 "memory_domains": [ 00:25:31.674 { 00:25:31.674 "dma_device_id": "system", 00:25:31.674 "dma_device_type": 1 00:25:31.674 }, 00:25:31.674 { 00:25:31.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:31.674 "dma_device_type": 2 00:25:31.674 } 00:25:31.674 ], 00:25:31.674 "driver_specific": { 00:25:31.674 "passthru": { 00:25:31.674 "name": "pt1", 00:25:31.674 "base_bdev_name": "malloc1" 00:25:31.674 } 00:25:31.674 } 00:25:31.674 }' 00:25:31.674 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:31.674 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:31.674 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:31.674 17:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:31.674 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:31.674 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:31.674 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:31.932 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:31.933 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:32.191 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:32.191 "name": "pt2", 00:25:32.191 "aliases": [ 00:25:32.191 "00000000-0000-0000-0000-000000000002" 00:25:32.191 ], 00:25:32.191 "product_name": "passthru", 00:25:32.191 "block_size": 512, 00:25:32.191 "num_blocks": 65536, 00:25:32.191 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:32.191 "assigned_rate_limits": { 00:25:32.191 "rw_ios_per_sec": 0, 00:25:32.191 "rw_mbytes_per_sec": 0, 00:25:32.191 "r_mbytes_per_sec": 0, 00:25:32.191 "w_mbytes_per_sec": 0 00:25:32.191 }, 00:25:32.191 "claimed": true, 00:25:32.191 "claim_type": "exclusive_write", 00:25:32.191 "zoned": false, 00:25:32.191 "supported_io_types": { 00:25:32.191 "read": true, 00:25:32.191 "write": true, 00:25:32.191 "unmap": true, 00:25:32.191 "flush": true, 00:25:32.191 "reset": true, 00:25:32.191 "nvme_admin": false, 00:25:32.191 "nvme_io": false, 00:25:32.191 "nvme_io_md": false, 00:25:32.191 "write_zeroes": true, 00:25:32.191 "zcopy": true, 00:25:32.191 "get_zone_info": false, 00:25:32.191 "zone_management": false, 00:25:32.191 "zone_append": false, 00:25:32.191 "compare": false, 00:25:32.191 "compare_and_write": false, 00:25:32.191 "abort": true, 00:25:32.191 "seek_hole": false, 00:25:32.191 "seek_data": false, 00:25:32.191 "copy": true, 00:25:32.191 "nvme_iov_md": false 00:25:32.191 }, 00:25:32.191 "memory_domains": [ 00:25:32.191 { 00:25:32.191 "dma_device_id": "system", 00:25:32.191 "dma_device_type": 1 00:25:32.191 }, 00:25:32.191 { 00:25:32.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:32.191 "dma_device_type": 2 00:25:32.191 } 00:25:32.191 ], 00:25:32.191 "driver_specific": { 00:25:32.191 "passthru": { 00:25:32.191 "name": "pt2", 00:25:32.191 "base_bdev_name": "malloc2" 00:25:32.191 } 00:25:32.191 } 00:25:32.191 }' 00:25:32.191 17:19:27 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:32.191 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:32.191 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:32.191 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:32.450 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:25:32.709 17:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:32.709 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:32.709 "name": "pt3", 00:25:32.709 "aliases": [ 00:25:32.709 "00000000-0000-0000-0000-000000000003" 00:25:32.709 ], 00:25:32.709 "product_name": "passthru", 00:25:32.709 "block_size": 512, 00:25:32.709 "num_blocks": 65536, 00:25:32.709 "uuid": "00000000-0000-0000-0000-000000000003", 00:25:32.709 "assigned_rate_limits": { 00:25:32.709 "rw_ios_per_sec": 0, 
00:25:32.709 "rw_mbytes_per_sec": 0, 00:25:32.709 "r_mbytes_per_sec": 0, 00:25:32.709 "w_mbytes_per_sec": 0 00:25:32.709 }, 00:25:32.709 "claimed": true, 00:25:32.709 "claim_type": "exclusive_write", 00:25:32.709 "zoned": false, 00:25:32.709 "supported_io_types": { 00:25:32.709 "read": true, 00:25:32.709 "write": true, 00:25:32.709 "unmap": true, 00:25:32.709 "flush": true, 00:25:32.709 "reset": true, 00:25:32.709 "nvme_admin": false, 00:25:32.709 "nvme_io": false, 00:25:32.709 "nvme_io_md": false, 00:25:32.709 "write_zeroes": true, 00:25:32.709 "zcopy": true, 00:25:32.709 "get_zone_info": false, 00:25:32.709 "zone_management": false, 00:25:32.709 "zone_append": false, 00:25:32.709 "compare": false, 00:25:32.709 "compare_and_write": false, 00:25:32.709 "abort": true, 00:25:32.709 "seek_hole": false, 00:25:32.709 "seek_data": false, 00:25:32.709 "copy": true, 00:25:32.709 "nvme_iov_md": false 00:25:32.709 }, 00:25:32.709 "memory_domains": [ 00:25:32.709 { 00:25:32.709 "dma_device_id": "system", 00:25:32.709 "dma_device_type": 1 00:25:32.709 }, 00:25:32.709 { 00:25:32.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:32.709 "dma_device_type": 2 00:25:32.709 } 00:25:32.709 ], 00:25:32.709 "driver_specific": { 00:25:32.709 "passthru": { 00:25:32.709 "name": "pt3", 00:25:32.709 "base_bdev_name": "malloc3" 00:25:32.709 } 00:25:32.709 } 00:25:32.709 }' 00:25:32.709 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:32.968 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:32.968 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:32.968 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:32.968 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:32.968 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:32.968 17:19:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:32.968 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:25:33.228 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:33.487 "name": "pt4", 00:25:33.487 "aliases": [ 00:25:33.487 "00000000-0000-0000-0000-000000000004" 00:25:33.487 ], 00:25:33.487 "product_name": "passthru", 00:25:33.487 "block_size": 512, 00:25:33.487 "num_blocks": 65536, 00:25:33.487 "uuid": "00000000-0000-0000-0000-000000000004", 00:25:33.487 "assigned_rate_limits": { 00:25:33.487 "rw_ios_per_sec": 0, 00:25:33.487 "rw_mbytes_per_sec": 0, 00:25:33.487 "r_mbytes_per_sec": 0, 00:25:33.487 "w_mbytes_per_sec": 0 00:25:33.487 }, 00:25:33.487 "claimed": true, 00:25:33.487 "claim_type": "exclusive_write", 00:25:33.487 "zoned": false, 00:25:33.487 "supported_io_types": { 00:25:33.487 "read": true, 00:25:33.487 "write": true, 00:25:33.487 "unmap": true, 00:25:33.487 "flush": true, 00:25:33.487 "reset": true, 00:25:33.487 "nvme_admin": false, 00:25:33.487 "nvme_io": false, 00:25:33.487 "nvme_io_md": false, 00:25:33.487 "write_zeroes": true, 00:25:33.487 "zcopy": 
true, 00:25:33.487 "get_zone_info": false, 00:25:33.487 "zone_management": false, 00:25:33.487 "zone_append": false, 00:25:33.487 "compare": false, 00:25:33.487 "compare_and_write": false, 00:25:33.487 "abort": true, 00:25:33.487 "seek_hole": false, 00:25:33.487 "seek_data": false, 00:25:33.487 "copy": true, 00:25:33.487 "nvme_iov_md": false 00:25:33.487 }, 00:25:33.487 "memory_domains": [ 00:25:33.487 { 00:25:33.487 "dma_device_id": "system", 00:25:33.487 "dma_device_type": 1 00:25:33.487 }, 00:25:33.487 { 00:25:33.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:33.487 "dma_device_type": 2 00:25:33.487 } 00:25:33.487 ], 00:25:33.487 "driver_specific": { 00:25:33.487 "passthru": { 00:25:33.487 "name": "pt4", 00:25:33.487 "base_bdev_name": "malloc4" 00:25:33.487 } 00:25:33.487 } 00:25:33.487 }' 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:33.487 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:33.746 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:33.746 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:33.746 17:19:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:33.746 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:25:33.746 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:33.746 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:34.006 [2024-07-23 17:19:29.233083] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d86fed13-1cf0-4239-8b78-ce9dfaa88837 '!=' d86fed13-1cf0-4239-8b78-ce9dfaa88837 ']' 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 4512 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 4512 ']' 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 4512 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 4512 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 4512' 00:25:34.006 killing process with pid 4512 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@967 -- # kill 4512 00:25:34.006 [2024-07-23 17:19:29.304201] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:34.006 [2024-07-23 17:19:29.304281] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:34.006 [2024-07-23 17:19:29.304351] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:34.006 [2024-07-23 17:19:29.304364] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ba3540 name raid_bdev1, state offline 00:25:34.006 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 4512 00:25:34.006 [2024-07-23 17:19:29.393632] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:34.575 17:19:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:25:34.575 00:25:34.575 real 0m21.847s 00:25:34.575 user 0m39.599s 00:25:34.575 sys 0m3.662s 00:25:34.575 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:34.575 17:19:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:25:34.575 ************************************ 00:25:34.575 END TEST raid_superblock_test 00:25:34.575 ************************************ 00:25:34.575 17:19:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:34.575 17:19:29 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:25:34.575 17:19:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:34.575 17:19:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:34.575 17:19:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:34.575 ************************************ 00:25:34.575 START TEST raid_read_error_test 00:25:34.575 ************************************ 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 read 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local 
base_bdevs 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.lhxuSCqQ4V 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=7654 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 7654 /var/tmp/spdk-raid.sock 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 7654 ']' 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for 
process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:34.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:34.575 17:19:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:34.575 [2024-07-23 17:19:29.962547] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:25:34.575 [2024-07-23 17:19:29.962618] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid7654 ] 00:25:34.834 [2024-07-23 17:19:30.096523] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:34.834 [2024-07-23 17:19:30.151559] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:34.834 [2024-07-23 17:19:30.213698] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:34.834 [2024-07-23 17:19:30.213731] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:35.770 17:19:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:35.770 17:19:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:25:35.770 17:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:35.770 17:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:35.770 BaseBdev1_malloc 00:25:35.770 17:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev1_malloc 00:25:36.028 true 00:25:36.029 17:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:25:36.287 [2024-07-23 17:19:31.579318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:25:36.287 [2024-07-23 17:19:31.579372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:36.287 [2024-07-23 17:19:31.579392] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8a5c0 00:25:36.287 [2024-07-23 17:19:31.579405] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:36.287 [2024-07-23 17:19:31.580968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:36.287 [2024-07-23 17:19:31.580997] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:36.287 BaseBdev1 00:25:36.287 17:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:36.287 17:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:36.545 BaseBdev2_malloc 00:25:36.545 17:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:25:36.804 true 00:25:36.804 17:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:25:37.062 [2024-07-23 17:19:32.321926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 
00:25:37.062 [2024-07-23 17:19:32.321969] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.062 [2024-07-23 17:19:32.321988] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf84620 00:25:37.062 [2024-07-23 17:19:32.322000] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.062 [2024-07-23 17:19:32.323351] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.062 [2024-07-23 17:19:32.323378] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:37.062 BaseBdev2 00:25:37.062 17:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:37.062 17:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:37.351 BaseBdev3_malloc 00:25:37.351 17:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:25:37.609 true 00:25:37.609 17:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:25:37.867 [2024-07-23 17:19:33.068488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:25:37.867 [2024-07-23 17:19:33.068533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.867 [2024-07-23 17:19:33.068554] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf84c00 00:25:37.867 [2024-07-23 17:19:33.068574] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.867 [2024-07-23 17:19:33.069967] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.867 [2024-07-23 17:19:33.069995] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:37.867 BaseBdev3 00:25:37.867 17:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:37.867 17:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:38.125 BaseBdev4_malloc 00:25:38.125 17:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:25:38.381 true 00:25:38.382 17:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:25:38.382 [2024-07-23 17:19:33.795005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:25:38.382 [2024-07-23 17:19:33.795048] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:38.382 [2024-07-23 17:19:33.795068] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf879c0 00:25:38.382 [2024-07-23 17:19:33.795080] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:38.382 [2024-07-23 17:19:33.796504] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:38.382 [2024-07-23 17:19:33.796532] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:38.382 BaseBdev4 00:25:38.639 17:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:25:38.639 [2024-07-23 17:19:34.035681] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:38.639 [2024-07-23 17:19:34.036875] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:38.639 [2024-07-23 17:19:34.036945] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:38.639 [2024-07-23 17:19:34.037005] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:38.639 [2024-07-23 17:19:34.037222] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe72510 00:25:38.639 [2024-07-23 17:19:34.037233] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:25:38.639 [2024-07-23 17:19:34.037412] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd6980 00:25:38.639 [2024-07-23 17:19:34.037555] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe72510 00:25:38.639 [2024-07-23 17:19:34.037565] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe72510 00:25:38.639 [2024-07-23 17:19:34.037658] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.639 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.939 "name": "raid_bdev1", 00:25:38.939 "uuid": "a7cc2281-2aa2-4b1b-a7d6-ad41ff71975c", 00:25:38.939 "strip_size_kb": 64, 00:25:38.939 "state": "online", 00:25:38.939 "raid_level": "concat", 00:25:38.939 "superblock": true, 00:25:38.939 "num_base_bdevs": 4, 00:25:38.939 "num_base_bdevs_discovered": 4, 00:25:38.939 "num_base_bdevs_operational": 4, 00:25:38.939 "base_bdevs_list": [ 00:25:38.939 { 00:25:38.939 "name": "BaseBdev1", 00:25:38.939 "uuid": "4bcb4ebb-2360-5261-8881-0f3031df8df0", 00:25:38.939 "is_configured": true, 00:25:38.939 "data_offset": 2048, 00:25:38.939 "data_size": 63488 00:25:38.939 }, 00:25:38.939 { 00:25:38.939 "name": "BaseBdev2", 00:25:38.939 "uuid": "dba60ebc-fc98-5d54-b99e-4b2d37651e3b", 00:25:38.939 "is_configured": true, 00:25:38.939 "data_offset": 2048, 00:25:38.939 "data_size": 63488 00:25:38.939 }, 00:25:38.939 { 00:25:38.939 "name": "BaseBdev3", 00:25:38.939 "uuid": "7b114135-92ff-5ca9-8a32-a27862d48380", 00:25:38.939 "is_configured": true, 00:25:38.939 "data_offset": 2048, 00:25:38.939 "data_size": 63488 00:25:38.939 }, 00:25:38.939 { 00:25:38.939 "name": "BaseBdev4", 
00:25:38.939 "uuid": "abc7c1a8-59ec-5ac3-ad2b-6bac0e08c4e6", 00:25:38.939 "is_configured": true, 00:25:38.939 "data_offset": 2048, 00:25:38.939 "data_size": 63488 00:25:38.939 } 00:25:38.939 ] 00:25:38.939 }' 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.939 17:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:39.504 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:25:39.504 17:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:39.762 [2024-07-23 17:19:35.002498] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe8e280 00:25:40.699 17:19:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:40.958 17:19:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.958 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.218 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.218 "name": "raid_bdev1", 00:25:41.218 "uuid": "a7cc2281-2aa2-4b1b-a7d6-ad41ff71975c", 00:25:41.218 "strip_size_kb": 64, 00:25:41.218 "state": "online", 00:25:41.218 "raid_level": "concat", 00:25:41.218 "superblock": true, 00:25:41.218 "num_base_bdevs": 4, 00:25:41.218 "num_base_bdevs_discovered": 4, 00:25:41.218 "num_base_bdevs_operational": 4, 00:25:41.218 "base_bdevs_list": [ 00:25:41.218 { 00:25:41.218 "name": "BaseBdev1", 00:25:41.218 "uuid": "4bcb4ebb-2360-5261-8881-0f3031df8df0", 00:25:41.218 "is_configured": true, 00:25:41.218 "data_offset": 2048, 00:25:41.218 "data_size": 63488 00:25:41.218 }, 00:25:41.218 { 00:25:41.218 "name": "BaseBdev2", 00:25:41.218 "uuid": "dba60ebc-fc98-5d54-b99e-4b2d37651e3b", 00:25:41.218 "is_configured": true, 00:25:41.218 "data_offset": 2048, 00:25:41.218 "data_size": 63488 00:25:41.218 }, 00:25:41.218 { 00:25:41.218 "name": "BaseBdev3", 00:25:41.218 "uuid": "7b114135-92ff-5ca9-8a32-a27862d48380", 00:25:41.218 "is_configured": true, 00:25:41.218 "data_offset": 2048, 00:25:41.218 "data_size": 63488 
00:25:41.218 }, 00:25:41.218 { 00:25:41.218 "name": "BaseBdev4", 00:25:41.218 "uuid": "abc7c1a8-59ec-5ac3-ad2b-6bac0e08c4e6", 00:25:41.218 "is_configured": true, 00:25:41.218 "data_offset": 2048, 00:25:41.218 "data_size": 63488 00:25:41.218 } 00:25:41.218 ] 00:25:41.218 }' 00:25:41.218 17:19:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.218 17:19:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:41.786 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:42.045 [2024-07-23 17:19:37.240675] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:42.045 [2024-07-23 17:19:37.240717] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:42.045 [2024-07-23 17:19:37.243880] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:42.045 [2024-07-23 17:19:37.243933] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.045 [2024-07-23 17:19:37.243972] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:42.045 [2024-07-23 17:19:37.243984] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe72510 name raid_bdev1, state offline 00:25:42.045 0 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 7654 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 7654 ']' 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 7654 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:42.045 
17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 7654 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 7654' 00:25:42.045 killing process with pid 7654 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 7654 00:25:42.045 [2024-07-23 17:19:37.322677] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:42.045 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 7654 00:25:42.045 [2024-07-23 17:19:37.354813] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.lhxuSCqQ4V 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:25:42.304 00:25:42.304 real 0m7.701s 00:25:42.304 user 0m12.313s 00:25:42.304 sys 0m1.389s 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:42.304 17:19:37 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:25:42.304 ************************************ 00:25:42.304 END TEST raid_read_error_test 00:25:42.304 ************************************ 00:25:42.304 17:19:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:42.305 17:19:37 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:25:42.305 17:19:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:42.305 17:19:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:42.305 17:19:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:42.305 ************************************ 00:25:42.305 START TEST raid_write_error_test 00:25:42.305 ************************************ 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.erqKSnj0Bn 00:25:42.305 17:19:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=8668 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 8668 /var/tmp/spdk-raid.sock 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 8668 ']' 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:42.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:42.305 17:19:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:42.564 [2024-07-23 17:19:37.759413] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:25:42.564 [2024-07-23 17:19:37.759486] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid8668 ] 00:25:42.564 [2024-07-23 17:19:37.891171] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:42.564 [2024-07-23 17:19:37.942813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:42.822 [2024-07-23 17:19:38.006703] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:42.822 [2024-07-23 17:19:38.006739] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:43.389 17:19:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:43.389 17:19:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:25:43.389 17:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:43.389 17:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:43.646 BaseBdev1_malloc 00:25:43.646 17:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:25:43.904 true 00:25:43.904 17:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:25:44.162 [2024-07-23 17:19:39.425244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:25:44.162 [2024-07-23 17:19:39.425292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:25:44.162 [2024-07-23 17:19:39.425312] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb85c0 00:25:44.162 [2024-07-23 17:19:39.425324] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:44.162 [2024-07-23 17:19:39.427005] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:44.162 [2024-07-23 17:19:39.427034] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:44.162 BaseBdev1 00:25:44.162 17:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:44.162 17:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:44.420 BaseBdev2_malloc 00:25:44.420 17:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:25:44.679 true 00:25:44.679 17:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:25:44.937 [2024-07-23 17:19:40.163826] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:25:44.937 [2024-07-23 17:19:40.163872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:44.937 [2024-07-23 17:19:40.163899] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb2620 00:25:44.937 [2024-07-23 17:19:40.163912] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:44.937 [2024-07-23 17:19:40.165477] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:44.937 [2024-07-23 17:19:40.165510] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:44.937 BaseBdev2 00:25:44.937 17:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:44.937 17:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:45.195 BaseBdev3_malloc 00:25:45.195 17:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:25:45.453 true 00:25:45.453 17:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:25:45.712 [2024-07-23 17:19:40.891536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:25:45.712 [2024-07-23 17:19:40.891580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:45.712 [2024-07-23 17:19:40.891602] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb2c00 00:25:45.712 [2024-07-23 17:19:40.891614] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:45.712 [2024-07-23 17:19:40.893189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:45.712 [2024-07-23 17:19:40.893217] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:45.712 BaseBdev3 00:25:45.712 17:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:25:45.712 17:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:45.970 BaseBdev4_malloc 00:25:45.970 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:25:45.970 true 00:25:45.970 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:25:46.229 [2024-07-23 17:19:41.611278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:25:46.229 [2024-07-23 17:19:41.611323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:46.229 [2024-07-23 17:19:41.611342] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb59c0 00:25:46.229 [2024-07-23 17:19:41.611354] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:46.229 [2024-07-23 17:19:41.612891] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:46.229 [2024-07-23 17:19:41.612926] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:46.229 BaseBdev4 00:25:46.229 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:25:46.488 [2024-07-23 17:19:41.855985] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:46.488 [2024-07-23 17:19:41.857295] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:46.488 [2024-07-23 17:19:41.857359] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:46.488 [2024-07-23 17:19:41.857419] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:46.488 [2024-07-23 17:19:41.857645] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xea0510 00:25:46.488 [2024-07-23 17:19:41.857656] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:25:46.488 [2024-07-23 17:19:41.857848] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe04980 00:25:46.489 [2024-07-23 17:19:41.858008] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xea0510 00:25:46.489 [2024-07-23 17:19:41.858024] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xea0510 00:25:46.489 [2024-07-23 17:19:41.858128] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:46.489 17:19:41 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.489 17:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.747 17:19:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:46.747 "name": "raid_bdev1", 00:25:46.747 "uuid": "45870447-9638-4fa2-91d2-24f05f3c6734", 00:25:46.747 "strip_size_kb": 64, 00:25:46.747 "state": "online", 00:25:46.747 "raid_level": "concat", 00:25:46.747 "superblock": true, 00:25:46.747 "num_base_bdevs": 4, 00:25:46.747 "num_base_bdevs_discovered": 4, 00:25:46.747 "num_base_bdevs_operational": 4, 00:25:46.747 "base_bdevs_list": [ 00:25:46.747 { 00:25:46.747 "name": "BaseBdev1", 00:25:46.747 "uuid": "249d5ca2-b725-562c-86e5-0b8a21ce4cfa", 00:25:46.747 "is_configured": true, 00:25:46.747 "data_offset": 2048, 00:25:46.747 "data_size": 63488 00:25:46.747 }, 00:25:46.747 { 00:25:46.747 "name": "BaseBdev2", 00:25:46.747 "uuid": "373cb626-c638-510d-83d0-407c86cb46ec", 00:25:46.747 "is_configured": true, 00:25:46.747 "data_offset": 2048, 00:25:46.747 "data_size": 63488 00:25:46.747 }, 00:25:46.747 { 00:25:46.747 "name": "BaseBdev3", 00:25:46.747 "uuid": "9b098c76-b82f-5724-aa39-2bcc21b757ed", 00:25:46.747 "is_configured": true, 00:25:46.747 "data_offset": 2048, 00:25:46.747 "data_size": 63488 00:25:46.747 }, 00:25:46.747 { 00:25:46.747 "name": "BaseBdev4", 00:25:46.747 "uuid": "d5e82ea1-d3b4-5824-9a6e-e0d1e987cff8", 00:25:46.747 "is_configured": true, 00:25:46.747 "data_offset": 2048, 00:25:46.747 "data_size": 63488 00:25:46.747 } 00:25:46.748 ] 00:25:46.748 }' 00:25:46.748 17:19:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:46.748 17:19:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:47.683 17:19:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:25:47.683 17:19:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:47.683 [2024-07-23 17:19:42.822771] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xebc280 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:48.618 17:19:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.618 17:19:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.877 17:19:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:48.877 "name": "raid_bdev1", 00:25:48.877 "uuid": "45870447-9638-4fa2-91d2-24f05f3c6734", 00:25:48.877 "strip_size_kb": 64, 00:25:48.877 "state": "online", 00:25:48.877 "raid_level": "concat", 00:25:48.877 "superblock": true, 00:25:48.877 "num_base_bdevs": 4, 00:25:48.877 "num_base_bdevs_discovered": 4, 00:25:48.877 "num_base_bdevs_operational": 4, 00:25:48.877 "base_bdevs_list": [ 00:25:48.877 { 00:25:48.877 "name": "BaseBdev1", 00:25:48.877 "uuid": "249d5ca2-b725-562c-86e5-0b8a21ce4cfa", 00:25:48.877 "is_configured": true, 00:25:48.877 "data_offset": 2048, 00:25:48.877 "data_size": 63488 00:25:48.877 }, 00:25:48.877 { 00:25:48.877 "name": "BaseBdev2", 00:25:48.877 "uuid": "373cb626-c638-510d-83d0-407c86cb46ec", 00:25:48.877 "is_configured": true, 00:25:48.877 "data_offset": 2048, 00:25:48.877 "data_size": 63488 00:25:48.877 }, 00:25:48.877 { 00:25:48.877 "name": "BaseBdev3", 00:25:48.877 "uuid": "9b098c76-b82f-5724-aa39-2bcc21b757ed", 00:25:48.877 "is_configured": true, 00:25:48.877 "data_offset": 2048, 00:25:48.877 "data_size": 63488 00:25:48.877 }, 00:25:48.877 { 00:25:48.877 "name": "BaseBdev4", 00:25:48.877 "uuid": "d5e82ea1-d3b4-5824-9a6e-e0d1e987cff8", 00:25:48.877 "is_configured": true, 00:25:48.877 "data_offset": 2048, 00:25:48.877 "data_size": 63488 00:25:48.877 } 00:25:48.877 ] 00:25:48.877 }' 00:25:48.877 17:19:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:48.877 17:19:44 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:25:49.445 17:19:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:49.704 [2024-07-23 17:19:44.963268] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:49.704 [2024-07-23 17:19:44.963310] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:49.704 [2024-07-23 17:19:44.966461] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:49.704 [2024-07-23 17:19:44.966504] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:49.704 [2024-07-23 17:19:44.966542] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:49.704 [2024-07-23 17:19:44.966552] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xea0510 name raid_bdev1, state offline 00:25:49.704 0 00:25:49.704 17:19:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 8668 00:25:49.704 17:19:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 8668 ']' 00:25:49.704 17:19:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 8668 00:25:49.704 17:19:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 8668 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 8668' 00:25:49.704 killing process with pid 8668 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 8668 00:25:49.704 [2024-07-23 17:19:45.047990] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:49.704 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 8668 00:25:49.704 [2024-07-23 17:19:45.078518] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.erqKSnj0Bn 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:25:49.964 00:25:49.964 real 0m7.607s 00:25:49.964 user 0m12.186s 00:25:49.964 sys 0m1.319s 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:49.964 17:19:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:25:49.964 ************************************ 00:25:49.964 END TEST raid_write_error_test 00:25:49.964 ************************************ 00:25:49.964 17:19:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:49.964 17:19:45 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:25:49.964 17:19:45 
bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:25:49.964 17:19:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:49.964 17:19:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:49.964 17:19:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:49.964 ************************************ 00:25:49.964 START TEST raid_state_function_test 00:25:49.964 ************************************ 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- 
# echo BaseBdev3 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=9789 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 9789' 00:25:49.964 Process raid pid: 9789 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 9789 /var/tmp/spdk-raid.sock 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 9789 ']' 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:49.964 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:49.964 17:19:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:50.224 [2024-07-23 17:19:45.443033] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:25:50.224 [2024-07-23 17:19:45.443108] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:50.224 [2024-07-23 17:19:45.578213] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:50.224 [2024-07-23 17:19:45.633224] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:50.483 [2024-07-23 17:19:45.703780] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:50.483 [2024-07-23 17:19:45.703817] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:51.051 17:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:51.051 17:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:25:51.051 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:25:51.310 [2024-07-23 17:19:46.588244] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:51.310 [2024-07-23 17:19:46.588285] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:51.310 [2024-07-23 17:19:46.588295] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:51.310 [2024-07-23 17:19:46.588307] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:51.310 [2024-07-23 17:19:46.588316] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:25:51.310 [2024-07-23 17:19:46.588327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:25:51.310 
[2024-07-23 17:19:46.588336] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:25:51.310 [2024-07-23 17:19:46.588347] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.310 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:51.599 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:51.599 "name": "Existed_Raid", 00:25:51.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.599 "strip_size_kb": 0, 00:25:51.599 "state": 
"configuring", 00:25:51.599 "raid_level": "raid1", 00:25:51.599 "superblock": false, 00:25:51.599 "num_base_bdevs": 4, 00:25:51.599 "num_base_bdevs_discovered": 0, 00:25:51.599 "num_base_bdevs_operational": 4, 00:25:51.599 "base_bdevs_list": [ 00:25:51.599 { 00:25:51.599 "name": "BaseBdev1", 00:25:51.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.599 "is_configured": false, 00:25:51.599 "data_offset": 0, 00:25:51.599 "data_size": 0 00:25:51.599 }, 00:25:51.599 { 00:25:51.599 "name": "BaseBdev2", 00:25:51.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.599 "is_configured": false, 00:25:51.599 "data_offset": 0, 00:25:51.599 "data_size": 0 00:25:51.599 }, 00:25:51.599 { 00:25:51.599 "name": "BaseBdev3", 00:25:51.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.599 "is_configured": false, 00:25:51.599 "data_offset": 0, 00:25:51.599 "data_size": 0 00:25:51.599 }, 00:25:51.599 { 00:25:51.599 "name": "BaseBdev4", 00:25:51.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.599 "is_configured": false, 00:25:51.599 "data_offset": 0, 00:25:51.599 "data_size": 0 00:25:51.599 } 00:25:51.599 ] 00:25:51.599 }' 00:25:51.600 17:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:51.600 17:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:52.166 17:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:52.425 [2024-07-23 17:19:47.666969] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:52.425 [2024-07-23 17:19:47.667002] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2652410 name Existed_Raid, state configuring 00:25:52.426 17:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:25:52.685 [2024-07-23 17:19:47.919653] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:52.685 [2024-07-23 17:19:47.919682] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:52.685 [2024-07-23 17:19:47.919691] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:52.685 [2024-07-23 17:19:47.919703] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:52.685 [2024-07-23 17:19:47.919711] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:25:52.685 [2024-07-23 17:19:47.919722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:25:52.685 [2024-07-23 17:19:47.919731] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:25:52.685 [2024-07-23 17:19:47.919747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:25:52.685 17:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:25:52.944 [2024-07-23 17:19:48.178159] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:52.944 BaseBdev1 00:25:52.944 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:52.944 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:25:52.944 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:52.944 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:25:52.944 17:19:48 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:52.944 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:52.944 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:53.203 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:53.462 [ 00:25:53.462 { 00:25:53.462 "name": "BaseBdev1", 00:25:53.462 "aliases": [ 00:25:53.462 "c74f5769-942f-4ec6-ad20-f561b7029c0f" 00:25:53.462 ], 00:25:53.462 "product_name": "Malloc disk", 00:25:53.462 "block_size": 512, 00:25:53.462 "num_blocks": 65536, 00:25:53.462 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:25:53.462 "assigned_rate_limits": { 00:25:53.462 "rw_ios_per_sec": 0, 00:25:53.462 "rw_mbytes_per_sec": 0, 00:25:53.462 "r_mbytes_per_sec": 0, 00:25:53.462 "w_mbytes_per_sec": 0 00:25:53.462 }, 00:25:53.462 "claimed": true, 00:25:53.462 "claim_type": "exclusive_write", 00:25:53.462 "zoned": false, 00:25:53.462 "supported_io_types": { 00:25:53.462 "read": true, 00:25:53.462 "write": true, 00:25:53.462 "unmap": true, 00:25:53.462 "flush": true, 00:25:53.462 "reset": true, 00:25:53.462 "nvme_admin": false, 00:25:53.462 "nvme_io": false, 00:25:53.462 "nvme_io_md": false, 00:25:53.462 "write_zeroes": true, 00:25:53.462 "zcopy": true, 00:25:53.462 "get_zone_info": false, 00:25:53.462 "zone_management": false, 00:25:53.462 "zone_append": false, 00:25:53.462 "compare": false, 00:25:53.462 "compare_and_write": false, 00:25:53.462 "abort": true, 00:25:53.462 "seek_hole": false, 00:25:53.462 "seek_data": false, 00:25:53.462 "copy": true, 00:25:53.462 "nvme_iov_md": false 00:25:53.462 }, 00:25:53.462 "memory_domains": [ 00:25:53.462 { 
00:25:53.462 "dma_device_id": "system", 00:25:53.462 "dma_device_type": 1 00:25:53.462 }, 00:25:53.462 { 00:25:53.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:53.462 "dma_device_type": 2 00:25:53.462 } 00:25:53.462 ], 00:25:53.462 "driver_specific": {} 00:25:53.462 } 00:25:53.462 ] 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.462 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:53.721 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:25:53.721 "name": "Existed_Raid", 00:25:53.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.722 "strip_size_kb": 0, 00:25:53.722 "state": "configuring", 00:25:53.722 "raid_level": "raid1", 00:25:53.722 "superblock": false, 00:25:53.722 "num_base_bdevs": 4, 00:25:53.722 "num_base_bdevs_discovered": 1, 00:25:53.722 "num_base_bdevs_operational": 4, 00:25:53.722 "base_bdevs_list": [ 00:25:53.722 { 00:25:53.722 "name": "BaseBdev1", 00:25:53.722 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:25:53.722 "is_configured": true, 00:25:53.722 "data_offset": 0, 00:25:53.722 "data_size": 65536 00:25:53.722 }, 00:25:53.722 { 00:25:53.722 "name": "BaseBdev2", 00:25:53.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.722 "is_configured": false, 00:25:53.722 "data_offset": 0, 00:25:53.722 "data_size": 0 00:25:53.722 }, 00:25:53.722 { 00:25:53.722 "name": "BaseBdev3", 00:25:53.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.722 "is_configured": false, 00:25:53.722 "data_offset": 0, 00:25:53.722 "data_size": 0 00:25:53.722 }, 00:25:53.722 { 00:25:53.722 "name": "BaseBdev4", 00:25:53.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.722 "is_configured": false, 00:25:53.722 "data_offset": 0, 00:25:53.722 "data_size": 0 00:25:53.722 } 00:25:53.722 ] 00:25:53.722 }' 00:25:53.722 17:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.722 17:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:54.288 17:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:54.547 [2024-07-23 17:19:49.754332] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:54.547 [2024-07-23 17:19:49.754370] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2651d40 name Existed_Raid, state configuring 
00:25:54.547 17:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:25:54.806 [2024-07-23 17:19:50.003026] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:54.806 [2024-07-23 17:19:50.004494] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:54.806 [2024-07-23 17:19:50.004529] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:54.806 [2024-07-23 17:19:50.004540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:25:54.806 [2024-07-23 17:19:50.004553] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:25:54.806 [2024-07-23 17:19:50.004563] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:25:54.806 [2024-07-23 17:19:50.004575] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.806 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:55.065 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.065 "name": "Existed_Raid", 00:25:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.065 "strip_size_kb": 0, 00:25:55.065 "state": "configuring", 00:25:55.065 "raid_level": "raid1", 00:25:55.065 "superblock": false, 00:25:55.065 "num_base_bdevs": 4, 00:25:55.065 "num_base_bdevs_discovered": 1, 00:25:55.065 "num_base_bdevs_operational": 4, 00:25:55.065 "base_bdevs_list": [ 00:25:55.065 { 00:25:55.065 "name": "BaseBdev1", 00:25:55.065 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:25:55.065 "is_configured": true, 00:25:55.065 "data_offset": 0, 00:25:55.065 "data_size": 65536 00:25:55.065 }, 00:25:55.065 { 00:25:55.065 "name": "BaseBdev2", 00:25:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.065 "is_configured": false, 00:25:55.065 "data_offset": 0, 00:25:55.065 "data_size": 0 00:25:55.065 }, 00:25:55.065 { 00:25:55.065 "name": "BaseBdev3", 00:25:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.065 "is_configured": false, 00:25:55.065 
"data_offset": 0, 00:25:55.065 "data_size": 0 00:25:55.065 }, 00:25:55.065 { 00:25:55.065 "name": "BaseBdev4", 00:25:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.065 "is_configured": false, 00:25:55.065 "data_offset": 0, 00:25:55.065 "data_size": 0 00:25:55.065 } 00:25:55.065 ] 00:25:55.065 }' 00:25:55.065 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.065 17:19:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:55.633 17:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:25:55.633 [2024-07-23 17:19:51.029190] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:55.633 BaseBdev2 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:55.633 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:55.892 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:56.151 [ 
00:25:56.151 { 00:25:56.151 "name": "BaseBdev2", 00:25:56.151 "aliases": [ 00:25:56.151 "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea" 00:25:56.151 ], 00:25:56.151 "product_name": "Malloc disk", 00:25:56.151 "block_size": 512, 00:25:56.151 "num_blocks": 65536, 00:25:56.151 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:25:56.151 "assigned_rate_limits": { 00:25:56.151 "rw_ios_per_sec": 0, 00:25:56.151 "rw_mbytes_per_sec": 0, 00:25:56.151 "r_mbytes_per_sec": 0, 00:25:56.151 "w_mbytes_per_sec": 0 00:25:56.151 }, 00:25:56.151 "claimed": true, 00:25:56.151 "claim_type": "exclusive_write", 00:25:56.151 "zoned": false, 00:25:56.151 "supported_io_types": { 00:25:56.151 "read": true, 00:25:56.151 "write": true, 00:25:56.151 "unmap": true, 00:25:56.151 "flush": true, 00:25:56.151 "reset": true, 00:25:56.151 "nvme_admin": false, 00:25:56.151 "nvme_io": false, 00:25:56.151 "nvme_io_md": false, 00:25:56.151 "write_zeroes": true, 00:25:56.151 "zcopy": true, 00:25:56.151 "get_zone_info": false, 00:25:56.151 "zone_management": false, 00:25:56.151 "zone_append": false, 00:25:56.151 "compare": false, 00:25:56.151 "compare_and_write": false, 00:25:56.151 "abort": true, 00:25:56.151 "seek_hole": false, 00:25:56.151 "seek_data": false, 00:25:56.151 "copy": true, 00:25:56.151 "nvme_iov_md": false 00:25:56.151 }, 00:25:56.151 "memory_domains": [ 00:25:56.151 { 00:25:56.151 "dma_device_id": "system", 00:25:56.151 "dma_device_type": 1 00:25:56.151 }, 00:25:56.151 { 00:25:56.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:56.151 "dma_device_type": 2 00:25:56.151 } 00:25:56.151 ], 00:25:56.151 "driver_specific": {} 00:25:56.151 } 00:25:56.151 ] 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:56.151 17:19:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.151 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:56.410 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.410 "name": "Existed_Raid", 00:25:56.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.411 "strip_size_kb": 0, 00:25:56.411 "state": "configuring", 00:25:56.411 "raid_level": "raid1", 00:25:56.411 "superblock": false, 00:25:56.411 "num_base_bdevs": 4, 00:25:56.411 "num_base_bdevs_discovered": 2, 00:25:56.411 "num_base_bdevs_operational": 4, 00:25:56.411 "base_bdevs_list": [ 00:25:56.411 { 00:25:56.411 
"name": "BaseBdev1", 00:25:56.411 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:25:56.411 "is_configured": true, 00:25:56.411 "data_offset": 0, 00:25:56.411 "data_size": 65536 00:25:56.411 }, 00:25:56.411 { 00:25:56.411 "name": "BaseBdev2", 00:25:56.411 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:25:56.411 "is_configured": true, 00:25:56.411 "data_offset": 0, 00:25:56.411 "data_size": 65536 00:25:56.411 }, 00:25:56.411 { 00:25:56.411 "name": "BaseBdev3", 00:25:56.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.411 "is_configured": false, 00:25:56.411 "data_offset": 0, 00:25:56.411 "data_size": 0 00:25:56.411 }, 00:25:56.411 { 00:25:56.411 "name": "BaseBdev4", 00:25:56.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.411 "is_configured": false, 00:25:56.411 "data_offset": 0, 00:25:56.411 "data_size": 0 00:25:56.411 } 00:25:56.411 ] 00:25:56.411 }' 00:25:56.411 17:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.411 17:19:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:25:57.347 [2024-07-23 17:19:52.640831] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:57.347 BaseBdev3 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:57.347 17:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:57.605 17:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:25:57.864 [ 00:25:57.864 { 00:25:57.864 "name": "BaseBdev3", 00:25:57.864 "aliases": [ 00:25:57.864 "7b689230-337f-4809-9d93-70c60ae633f4" 00:25:57.864 ], 00:25:57.864 "product_name": "Malloc disk", 00:25:57.864 "block_size": 512, 00:25:57.864 "num_blocks": 65536, 00:25:57.864 "uuid": "7b689230-337f-4809-9d93-70c60ae633f4", 00:25:57.864 "assigned_rate_limits": { 00:25:57.864 "rw_ios_per_sec": 0, 00:25:57.864 "rw_mbytes_per_sec": 0, 00:25:57.864 "r_mbytes_per_sec": 0, 00:25:57.864 "w_mbytes_per_sec": 0 00:25:57.864 }, 00:25:57.864 "claimed": true, 00:25:57.864 "claim_type": "exclusive_write", 00:25:57.864 "zoned": false, 00:25:57.864 "supported_io_types": { 00:25:57.864 "read": true, 00:25:57.864 "write": true, 00:25:57.864 "unmap": true, 00:25:57.864 "flush": true, 00:25:57.864 "reset": true, 00:25:57.864 "nvme_admin": false, 00:25:57.864 "nvme_io": false, 00:25:57.864 "nvme_io_md": false, 00:25:57.864 "write_zeroes": true, 00:25:57.864 "zcopy": true, 00:25:57.864 "get_zone_info": false, 00:25:57.864 "zone_management": false, 00:25:57.864 "zone_append": false, 00:25:57.864 "compare": false, 00:25:57.864 "compare_and_write": false, 00:25:57.864 "abort": true, 00:25:57.864 "seek_hole": false, 00:25:57.864 "seek_data": false, 00:25:57.864 "copy": true, 00:25:57.864 "nvme_iov_md": false 00:25:57.864 }, 00:25:57.864 "memory_domains": [ 00:25:57.864 { 00:25:57.864 "dma_device_id": "system", 
00:25:57.864 "dma_device_type": 1 00:25:57.864 }, 00:25:57.864 { 00:25:57.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:57.864 "dma_device_type": 2 00:25:57.864 } 00:25:57.864 ], 00:25:57.864 "driver_specific": {} 00:25:57.864 } 00:25:57.864 ] 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.864 17:19:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:58.122 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.122 "name": "Existed_Raid", 00:25:58.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.122 "strip_size_kb": 0, 00:25:58.122 "state": "configuring", 00:25:58.122 "raid_level": "raid1", 00:25:58.122 "superblock": false, 00:25:58.122 "num_base_bdevs": 4, 00:25:58.122 "num_base_bdevs_discovered": 3, 00:25:58.122 "num_base_bdevs_operational": 4, 00:25:58.122 "base_bdevs_list": [ 00:25:58.122 { 00:25:58.122 "name": "BaseBdev1", 00:25:58.122 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:25:58.122 "is_configured": true, 00:25:58.122 "data_offset": 0, 00:25:58.122 "data_size": 65536 00:25:58.122 }, 00:25:58.122 { 00:25:58.122 "name": "BaseBdev2", 00:25:58.122 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:25:58.122 "is_configured": true, 00:25:58.122 "data_offset": 0, 00:25:58.122 "data_size": 65536 00:25:58.122 }, 00:25:58.122 { 00:25:58.122 "name": "BaseBdev3", 00:25:58.122 "uuid": "7b689230-337f-4809-9d93-70c60ae633f4", 00:25:58.122 "is_configured": true, 00:25:58.122 "data_offset": 0, 00:25:58.122 "data_size": 65536 00:25:58.122 }, 00:25:58.122 { 00:25:58.122 "name": "BaseBdev4", 00:25:58.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.122 "is_configured": false, 00:25:58.122 "data_offset": 0, 00:25:58.122 "data_size": 0 00:25:58.122 } 00:25:58.122 ] 00:25:58.122 }' 00:25:58.122 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.122 17:19:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:25:58.688 17:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:25:58.945 [2024-07-23 17:19:54.220486] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:58.946 [2024-07-23 17:19:54.220527] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2651990 00:25:58.946 [2024-07-23 17:19:54.220536] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:58.946 [2024-07-23 17:19:54.220734] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27fc310 00:25:58.946 [2024-07-23 17:19:54.220862] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2651990 00:25:58.946 [2024-07-23 17:19:54.220872] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2651990 00:25:58.946 [2024-07-23 17:19:54.221049] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:58.946 BaseBdev4 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:58.946 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:59.203 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:25:59.461 [ 00:25:59.461 { 
00:25:59.461 "name": "BaseBdev4", 00:25:59.461 "aliases": [ 00:25:59.461 "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f" 00:25:59.461 ], 00:25:59.461 "product_name": "Malloc disk", 00:25:59.461 "block_size": 512, 00:25:59.461 "num_blocks": 65536, 00:25:59.461 "uuid": "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f", 00:25:59.461 "assigned_rate_limits": { 00:25:59.461 "rw_ios_per_sec": 0, 00:25:59.461 "rw_mbytes_per_sec": 0, 00:25:59.462 "r_mbytes_per_sec": 0, 00:25:59.462 "w_mbytes_per_sec": 0 00:25:59.462 }, 00:25:59.462 "claimed": true, 00:25:59.462 "claim_type": "exclusive_write", 00:25:59.462 "zoned": false, 00:25:59.462 "supported_io_types": { 00:25:59.462 "read": true, 00:25:59.462 "write": true, 00:25:59.462 "unmap": true, 00:25:59.462 "flush": true, 00:25:59.462 "reset": true, 00:25:59.462 "nvme_admin": false, 00:25:59.462 "nvme_io": false, 00:25:59.462 "nvme_io_md": false, 00:25:59.462 "write_zeroes": true, 00:25:59.462 "zcopy": true, 00:25:59.462 "get_zone_info": false, 00:25:59.462 "zone_management": false, 00:25:59.462 "zone_append": false, 00:25:59.462 "compare": false, 00:25:59.462 "compare_and_write": false, 00:25:59.462 "abort": true, 00:25:59.462 "seek_hole": false, 00:25:59.462 "seek_data": false, 00:25:59.462 "copy": true, 00:25:59.462 "nvme_iov_md": false 00:25:59.462 }, 00:25:59.462 "memory_domains": [ 00:25:59.462 { 00:25:59.462 "dma_device_id": "system", 00:25:59.462 "dma_device_type": 1 00:25:59.462 }, 00:25:59.462 { 00:25:59.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:59.462 "dma_device_type": 2 00:25:59.462 } 00:25:59.462 ], 00:25:59.462 "driver_specific": {} 00:25:59.462 } 00:25:59.462 ] 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:59.462 17:19:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:59.462 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.721 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.721 "name": "Existed_Raid", 00:25:59.721 "uuid": "1f6c7dff-ea42-4c53-8c40-a8f238d36028", 00:25:59.721 "strip_size_kb": 0, 00:25:59.721 "state": "online", 00:25:59.721 "raid_level": "raid1", 00:25:59.721 "superblock": false, 00:25:59.721 "num_base_bdevs": 4, 00:25:59.721 "num_base_bdevs_discovered": 4, 00:25:59.721 "num_base_bdevs_operational": 4, 00:25:59.721 "base_bdevs_list": [ 00:25:59.721 { 00:25:59.721 "name": 
"BaseBdev1", 00:25:59.721 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:25:59.721 "is_configured": true, 00:25:59.721 "data_offset": 0, 00:25:59.721 "data_size": 65536 00:25:59.721 }, 00:25:59.721 { 00:25:59.721 "name": "BaseBdev2", 00:25:59.721 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:25:59.721 "is_configured": true, 00:25:59.721 "data_offset": 0, 00:25:59.721 "data_size": 65536 00:25:59.721 }, 00:25:59.721 { 00:25:59.721 "name": "BaseBdev3", 00:25:59.721 "uuid": "7b689230-337f-4809-9d93-70c60ae633f4", 00:25:59.721 "is_configured": true, 00:25:59.721 "data_offset": 0, 00:25:59.721 "data_size": 65536 00:25:59.721 }, 00:25:59.721 { 00:25:59.721 "name": "BaseBdev4", 00:25:59.721 "uuid": "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f", 00:25:59.721 "is_configured": true, 00:25:59.721 "data_offset": 0, 00:25:59.721 "data_size": 65536 00:25:59.721 } 00:25:59.721 ] 00:25:59.721 }' 00:25:59.721 17:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.721 17:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:26:00.288 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:00.288 17:19:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:00.547 [2024-07-23 17:19:55.821092] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:00.547 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:00.547 "name": "Existed_Raid", 00:26:00.547 "aliases": [ 00:26:00.547 "1f6c7dff-ea42-4c53-8c40-a8f238d36028" 00:26:00.547 ], 00:26:00.547 "product_name": "Raid Volume", 00:26:00.547 "block_size": 512, 00:26:00.547 "num_blocks": 65536, 00:26:00.547 "uuid": "1f6c7dff-ea42-4c53-8c40-a8f238d36028", 00:26:00.547 "assigned_rate_limits": { 00:26:00.547 "rw_ios_per_sec": 0, 00:26:00.547 "rw_mbytes_per_sec": 0, 00:26:00.547 "r_mbytes_per_sec": 0, 00:26:00.547 "w_mbytes_per_sec": 0 00:26:00.547 }, 00:26:00.547 "claimed": false, 00:26:00.547 "zoned": false, 00:26:00.547 "supported_io_types": { 00:26:00.547 "read": true, 00:26:00.547 "write": true, 00:26:00.547 "unmap": false, 00:26:00.547 "flush": false, 00:26:00.547 "reset": true, 00:26:00.547 "nvme_admin": false, 00:26:00.547 "nvme_io": false, 00:26:00.547 "nvme_io_md": false, 00:26:00.547 "write_zeroes": true, 00:26:00.547 "zcopy": false, 00:26:00.547 "get_zone_info": false, 00:26:00.547 "zone_management": false, 00:26:00.547 "zone_append": false, 00:26:00.547 "compare": false, 00:26:00.547 "compare_and_write": false, 00:26:00.547 "abort": false, 00:26:00.547 "seek_hole": false, 00:26:00.547 "seek_data": false, 00:26:00.547 "copy": false, 00:26:00.547 "nvme_iov_md": false 00:26:00.547 }, 00:26:00.547 "memory_domains": [ 00:26:00.547 { 00:26:00.547 "dma_device_id": "system", 00:26:00.547 "dma_device_type": 1 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.547 "dma_device_type": 2 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "system", 00:26:00.547 "dma_device_type": 1 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:26:00.547 "dma_device_type": 2 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "system", 00:26:00.547 "dma_device_type": 1 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.547 "dma_device_type": 2 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "system", 00:26:00.547 "dma_device_type": 1 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:00.547 "dma_device_type": 2 00:26:00.547 } 00:26:00.547 ], 00:26:00.547 "driver_specific": { 00:26:00.547 "raid": { 00:26:00.547 "uuid": "1f6c7dff-ea42-4c53-8c40-a8f238d36028", 00:26:00.547 "strip_size_kb": 0, 00:26:00.547 "state": "online", 00:26:00.547 "raid_level": "raid1", 00:26:00.547 "superblock": false, 00:26:00.547 "num_base_bdevs": 4, 00:26:00.547 "num_base_bdevs_discovered": 4, 00:26:00.547 "num_base_bdevs_operational": 4, 00:26:00.547 "base_bdevs_list": [ 00:26:00.547 { 00:26:00.547 "name": "BaseBdev1", 00:26:00.547 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:26:00.547 "is_configured": true, 00:26:00.547 "data_offset": 0, 00:26:00.547 "data_size": 65536 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "name": "BaseBdev2", 00:26:00.547 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:26:00.547 "is_configured": true, 00:26:00.547 "data_offset": 0, 00:26:00.547 "data_size": 65536 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "name": "BaseBdev3", 00:26:00.547 "uuid": "7b689230-337f-4809-9d93-70c60ae633f4", 00:26:00.547 "is_configured": true, 00:26:00.547 "data_offset": 0, 00:26:00.547 "data_size": 65536 00:26:00.547 }, 00:26:00.547 { 00:26:00.547 "name": "BaseBdev4", 00:26:00.547 "uuid": "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f", 00:26:00.547 "is_configured": true, 00:26:00.547 "data_offset": 0, 00:26:00.547 "data_size": 65536 00:26:00.547 } 00:26:00.547 ] 00:26:00.547 } 00:26:00.547 } 00:26:00.547 }' 00:26:00.547 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:00.547 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:00.547 BaseBdev2 00:26:00.547 BaseBdev3 00:26:00.547 BaseBdev4' 00:26:00.548 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:00.548 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:00.548 17:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:01.117 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:01.117 "name": "BaseBdev1", 00:26:01.117 "aliases": [ 00:26:01.117 "c74f5769-942f-4ec6-ad20-f561b7029c0f" 00:26:01.117 ], 00:26:01.117 "product_name": "Malloc disk", 00:26:01.117 "block_size": 512, 00:26:01.117 "num_blocks": 65536, 00:26:01.117 "uuid": "c74f5769-942f-4ec6-ad20-f561b7029c0f", 00:26:01.117 "assigned_rate_limits": { 00:26:01.117 "rw_ios_per_sec": 0, 00:26:01.117 "rw_mbytes_per_sec": 0, 00:26:01.117 "r_mbytes_per_sec": 0, 00:26:01.117 "w_mbytes_per_sec": 0 00:26:01.117 }, 00:26:01.117 "claimed": true, 00:26:01.117 "claim_type": "exclusive_write", 00:26:01.117 "zoned": false, 00:26:01.117 "supported_io_types": { 00:26:01.117 "read": true, 00:26:01.117 "write": true, 00:26:01.117 "unmap": true, 00:26:01.117 "flush": true, 00:26:01.117 "reset": true, 00:26:01.117 "nvme_admin": false, 00:26:01.117 "nvme_io": false, 00:26:01.117 "nvme_io_md": false, 00:26:01.117 "write_zeroes": true, 00:26:01.117 "zcopy": true, 00:26:01.117 "get_zone_info": false, 00:26:01.117 "zone_management": false, 00:26:01.117 "zone_append": false, 00:26:01.117 "compare": false, 00:26:01.117 "compare_and_write": false, 00:26:01.117 "abort": true, 00:26:01.117 "seek_hole": false, 00:26:01.117 "seek_data": 
false, 00:26:01.117 "copy": true, 00:26:01.117 "nvme_iov_md": false 00:26:01.117 }, 00:26:01.117 "memory_domains": [ 00:26:01.117 { 00:26:01.117 "dma_device_id": "system", 00:26:01.117 "dma_device_type": 1 00:26:01.117 }, 00:26:01.117 { 00:26:01.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:01.117 "dma_device_type": 2 00:26:01.117 } 00:26:01.117 ], 00:26:01.117 "driver_specific": {} 00:26:01.117 }' 00:26:01.117 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:01.117 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:01.117 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:01.117 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:01.375 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:01.375 17:19:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:01.633 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:01.633 "name": "BaseBdev2", 00:26:01.633 "aliases": [ 00:26:01.633 "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea" 00:26:01.633 ], 00:26:01.633 "product_name": "Malloc disk", 00:26:01.633 "block_size": 512, 00:26:01.633 "num_blocks": 65536, 00:26:01.633 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:26:01.633 "assigned_rate_limits": { 00:26:01.633 "rw_ios_per_sec": 0, 00:26:01.633 "rw_mbytes_per_sec": 0, 00:26:01.633 "r_mbytes_per_sec": 0, 00:26:01.633 "w_mbytes_per_sec": 0 00:26:01.633 }, 00:26:01.633 "claimed": true, 00:26:01.633 "claim_type": "exclusive_write", 00:26:01.633 "zoned": false, 00:26:01.633 "supported_io_types": { 00:26:01.633 "read": true, 00:26:01.633 "write": true, 00:26:01.633 "unmap": true, 00:26:01.633 "flush": true, 00:26:01.633 "reset": true, 00:26:01.633 "nvme_admin": false, 00:26:01.633 "nvme_io": false, 00:26:01.633 "nvme_io_md": false, 00:26:01.633 "write_zeroes": true, 00:26:01.633 "zcopy": true, 00:26:01.633 "get_zone_info": false, 00:26:01.633 "zone_management": false, 00:26:01.633 "zone_append": false, 00:26:01.633 "compare": false, 00:26:01.633 "compare_and_write": false, 00:26:01.633 "abort": true, 00:26:01.633 "seek_hole": false, 00:26:01.633 "seek_data": false, 00:26:01.633 "copy": true, 00:26:01.633 "nvme_iov_md": false 00:26:01.633 }, 00:26:01.633 "memory_domains": [ 00:26:01.633 { 00:26:01.633 "dma_device_id": "system", 00:26:01.633 "dma_device_type": 1 00:26:01.633 }, 00:26:01.633 { 00:26:01.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:01.633 "dma_device_type": 2 00:26:01.633 } 00:26:01.633 ], 00:26:01.633 "driver_specific": {} 00:26:01.633 }' 00:26:01.633 17:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:01.633 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:01.891 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:02.150 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:02.150 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:02.150 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:02.150 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:02.408 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:02.408 "name": "BaseBdev3", 00:26:02.408 "aliases": [ 00:26:02.408 "7b689230-337f-4809-9d93-70c60ae633f4" 00:26:02.408 ], 00:26:02.408 "product_name": "Malloc disk", 00:26:02.408 "block_size": 512, 00:26:02.408 "num_blocks": 65536, 00:26:02.408 "uuid": "7b689230-337f-4809-9d93-70c60ae633f4", 00:26:02.408 "assigned_rate_limits": { 00:26:02.408 "rw_ios_per_sec": 0, 00:26:02.408 "rw_mbytes_per_sec": 0, 00:26:02.408 "r_mbytes_per_sec": 0, 
00:26:02.408 "w_mbytes_per_sec": 0 00:26:02.408 }, 00:26:02.408 "claimed": true, 00:26:02.408 "claim_type": "exclusive_write", 00:26:02.408 "zoned": false, 00:26:02.408 "supported_io_types": { 00:26:02.408 "read": true, 00:26:02.408 "write": true, 00:26:02.408 "unmap": true, 00:26:02.408 "flush": true, 00:26:02.408 "reset": true, 00:26:02.408 "nvme_admin": false, 00:26:02.408 "nvme_io": false, 00:26:02.408 "nvme_io_md": false, 00:26:02.408 "write_zeroes": true, 00:26:02.408 "zcopy": true, 00:26:02.408 "get_zone_info": false, 00:26:02.408 "zone_management": false, 00:26:02.408 "zone_append": false, 00:26:02.408 "compare": false, 00:26:02.408 "compare_and_write": false, 00:26:02.408 "abort": true, 00:26:02.408 "seek_hole": false, 00:26:02.408 "seek_data": false, 00:26:02.409 "copy": true, 00:26:02.409 "nvme_iov_md": false 00:26:02.409 }, 00:26:02.409 "memory_domains": [ 00:26:02.409 { 00:26:02.409 "dma_device_id": "system", 00:26:02.409 "dma_device_type": 1 00:26:02.409 }, 00:26:02.409 { 00:26:02.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.409 "dma_device_type": 2 00:26:02.409 } 00:26:02.409 ], 00:26:02.409 "driver_specific": {} 00:26:02.409 }' 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:02.409 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:26:02.667 17:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:02.925 "name": "BaseBdev4", 00:26:02.925 "aliases": [ 00:26:02.925 "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f" 00:26:02.925 ], 00:26:02.925 "product_name": "Malloc disk", 00:26:02.925 "block_size": 512, 00:26:02.925 "num_blocks": 65536, 00:26:02.925 "uuid": "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f", 00:26:02.925 "assigned_rate_limits": { 00:26:02.925 "rw_ios_per_sec": 0, 00:26:02.925 "rw_mbytes_per_sec": 0, 00:26:02.925 "r_mbytes_per_sec": 0, 00:26:02.925 "w_mbytes_per_sec": 0 00:26:02.925 }, 00:26:02.925 "claimed": true, 00:26:02.925 "claim_type": "exclusive_write", 00:26:02.925 "zoned": false, 00:26:02.925 "supported_io_types": { 00:26:02.925 "read": true, 00:26:02.925 "write": true, 00:26:02.925 "unmap": true, 00:26:02.925 "flush": true, 00:26:02.925 "reset": true, 00:26:02.925 "nvme_admin": false, 00:26:02.925 "nvme_io": false, 00:26:02.925 "nvme_io_md": false, 00:26:02.925 "write_zeroes": true, 00:26:02.925 "zcopy": true, 00:26:02.925 "get_zone_info": false, 00:26:02.925 "zone_management": false, 00:26:02.925 "zone_append": false, 00:26:02.925 
"compare": false, 00:26:02.925 "compare_and_write": false, 00:26:02.925 "abort": true, 00:26:02.925 "seek_hole": false, 00:26:02.925 "seek_data": false, 00:26:02.925 "copy": true, 00:26:02.925 "nvme_iov_md": false 00:26:02.925 }, 00:26:02.925 "memory_domains": [ 00:26:02.925 { 00:26:02.925 "dma_device_id": "system", 00:26:02.925 "dma_device_type": 1 00:26:02.925 }, 00:26:02.925 { 00:26:02.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.925 "dma_device_type": 2 00:26:02.925 } 00:26:02.925 ], 00:26:02.925 "driver_specific": {} 00:26:02.925 }' 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:02.925 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.184 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.184 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:03.184 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.184 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.184 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:03.184 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:26:03.443 [2024-07-23 17:19:58.744577] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.443 17:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.443 17:19:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:03.702 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.702 "name": "Existed_Raid", 00:26:03.702 "uuid": "1f6c7dff-ea42-4c53-8c40-a8f238d36028", 00:26:03.702 "strip_size_kb": 0, 00:26:03.702 "state": "online", 00:26:03.702 "raid_level": "raid1", 00:26:03.702 "superblock": false, 00:26:03.702 "num_base_bdevs": 4, 00:26:03.702 "num_base_bdevs_discovered": 3, 00:26:03.702 "num_base_bdevs_operational": 3, 00:26:03.702 "base_bdevs_list": [ 00:26:03.702 { 00:26:03.702 "name": null, 00:26:03.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.702 "is_configured": false, 00:26:03.702 "data_offset": 0, 00:26:03.702 "data_size": 65536 00:26:03.702 }, 00:26:03.702 { 00:26:03.702 "name": "BaseBdev2", 00:26:03.702 "uuid": "b985e0b2-50c4-4573-ae6f-c6d37bb1c6ea", 00:26:03.702 "is_configured": true, 00:26:03.702 "data_offset": 0, 00:26:03.702 "data_size": 65536 00:26:03.702 }, 00:26:03.702 { 00:26:03.702 "name": "BaseBdev3", 00:26:03.702 "uuid": "7b689230-337f-4809-9d93-70c60ae633f4", 00:26:03.702 "is_configured": true, 00:26:03.702 "data_offset": 0, 00:26:03.702 "data_size": 65536 00:26:03.702 }, 00:26:03.702 { 00:26:03.702 "name": "BaseBdev4", 00:26:03.702 "uuid": "6718a0d4-6b5e-47d6-a1b0-f3f48a9cf55f", 00:26:03.702 "is_configured": true, 00:26:03.702 "data_offset": 0, 00:26:03.702 "data_size": 65536 00:26:03.702 } 00:26:03.702 ] 00:26:03.702 }' 00:26:03.702 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.702 17:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:04.269 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:04.269 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:04.269 17:19:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.269 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:04.527 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:04.527 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:04.527 17:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:05.094 [2024-07-23 17:20:00.377925] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:05.094 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:05.094 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:05.094 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.094 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:05.352 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:05.352 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:05.352 17:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:26:05.954 [2024-07-23 17:20:01.148428] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:26:05.954 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:26:05.954 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:05.954 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.954 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:06.212 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:06.212 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:06.212 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:26:06.778 [2024-07-23 17:20:01.917038] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:26:06.778 [2024-07-23 17:20:01.917124] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:06.778 [2024-07-23 17:20:01.929757] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:06.778 [2024-07-23 17:20:01.929794] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:06.778 [2024-07-23 17:20:01.929806] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2651990 name Existed_Raid, state offline 00:26:06.778 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:06.778 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:06.778 17:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.778 17:20:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:26:07.036 BaseBdev2 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:07.036 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:07.294 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:07.553 [ 00:26:07.553 { 00:26:07.553 "name": "BaseBdev2", 00:26:07.553 "aliases": [ 
00:26:07.553 "a2a11f5e-36c9-44a5-bc17-7f054f758c7a" 00:26:07.553 ], 00:26:07.553 "product_name": "Malloc disk", 00:26:07.553 "block_size": 512, 00:26:07.553 "num_blocks": 65536, 00:26:07.553 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a", 00:26:07.553 "assigned_rate_limits": { 00:26:07.553 "rw_ios_per_sec": 0, 00:26:07.553 "rw_mbytes_per_sec": 0, 00:26:07.553 "r_mbytes_per_sec": 0, 00:26:07.553 "w_mbytes_per_sec": 0 00:26:07.553 }, 00:26:07.553 "claimed": false, 00:26:07.553 "zoned": false, 00:26:07.553 "supported_io_types": { 00:26:07.553 "read": true, 00:26:07.553 "write": true, 00:26:07.553 "unmap": true, 00:26:07.553 "flush": true, 00:26:07.553 "reset": true, 00:26:07.553 "nvme_admin": false, 00:26:07.553 "nvme_io": false, 00:26:07.553 "nvme_io_md": false, 00:26:07.553 "write_zeroes": true, 00:26:07.553 "zcopy": true, 00:26:07.553 "get_zone_info": false, 00:26:07.553 "zone_management": false, 00:26:07.553 "zone_append": false, 00:26:07.553 "compare": false, 00:26:07.553 "compare_and_write": false, 00:26:07.553 "abort": true, 00:26:07.553 "seek_hole": false, 00:26:07.553 "seek_data": false, 00:26:07.553 "copy": true, 00:26:07.553 "nvme_iov_md": false 00:26:07.553 }, 00:26:07.553 "memory_domains": [ 00:26:07.553 { 00:26:07.553 "dma_device_id": "system", 00:26:07.553 "dma_device_type": 1 00:26:07.553 }, 00:26:07.553 { 00:26:07.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.553 "dma_device_type": 2 00:26:07.553 } 00:26:07.553 ], 00:26:07.553 "driver_specific": {} 00:26:07.553 } 00:26:07.553 ] 00:26:07.553 17:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:26:07.553 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:07.553 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:07.553 17:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:26:07.812 BaseBdev3 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:07.812 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:08.069 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:26:08.328 [ 00:26:08.328 { 00:26:08.328 "name": "BaseBdev3", 00:26:08.328 "aliases": [ 00:26:08.328 "adc44440-1292-4a28-8536-cbf383132460" 00:26:08.328 ], 00:26:08.328 "product_name": "Malloc disk", 00:26:08.328 "block_size": 512, 00:26:08.328 "num_blocks": 65536, 00:26:08.328 "uuid": "adc44440-1292-4a28-8536-cbf383132460", 00:26:08.328 "assigned_rate_limits": { 00:26:08.328 "rw_ios_per_sec": 0, 00:26:08.328 "rw_mbytes_per_sec": 0, 00:26:08.328 "r_mbytes_per_sec": 0, 00:26:08.328 "w_mbytes_per_sec": 0 00:26:08.328 }, 00:26:08.328 "claimed": false, 00:26:08.328 "zoned": false, 00:26:08.328 "supported_io_types": { 00:26:08.328 "read": true, 00:26:08.328 "write": true, 00:26:08.328 "unmap": true, 00:26:08.328 "flush": true, 00:26:08.328 "reset": true, 00:26:08.328 "nvme_admin": false, 00:26:08.328 
"nvme_io": false, 00:26:08.328 "nvme_io_md": false, 00:26:08.328 "write_zeroes": true, 00:26:08.328 "zcopy": true, 00:26:08.328 "get_zone_info": false, 00:26:08.328 "zone_management": false, 00:26:08.328 "zone_append": false, 00:26:08.328 "compare": false, 00:26:08.328 "compare_and_write": false, 00:26:08.328 "abort": true, 00:26:08.328 "seek_hole": false, 00:26:08.328 "seek_data": false, 00:26:08.328 "copy": true, 00:26:08.328 "nvme_iov_md": false 00:26:08.328 }, 00:26:08.328 "memory_domains": [ 00:26:08.328 { 00:26:08.328 "dma_device_id": "system", 00:26:08.328 "dma_device_type": 1 00:26:08.328 }, 00:26:08.328 { 00:26:08.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:08.328 "dma_device_type": 2 00:26:08.328 } 00:26:08.328 ], 00:26:08.328 "driver_specific": {} 00:26:08.328 } 00:26:08.328 ] 00:26:08.328 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:26:08.328 17:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:08.328 17:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:08.328 17:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:26:08.587 BaseBdev4 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:08.587 17:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:08.845 17:20:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:26:09.105 [ 00:26:09.105 { 00:26:09.105 "name": "BaseBdev4", 00:26:09.105 "aliases": [ 00:26:09.105 "f6b1c274-e556-4df8-9df4-1f3017fa0767" 00:26:09.105 ], 00:26:09.105 "product_name": "Malloc disk", 00:26:09.105 "block_size": 512, 00:26:09.105 "num_blocks": 65536, 00:26:09.105 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767", 00:26:09.105 "assigned_rate_limits": { 00:26:09.105 "rw_ios_per_sec": 0, 00:26:09.105 "rw_mbytes_per_sec": 0, 00:26:09.105 "r_mbytes_per_sec": 0, 00:26:09.105 "w_mbytes_per_sec": 0 00:26:09.105 }, 00:26:09.105 "claimed": false, 00:26:09.105 "zoned": false, 00:26:09.105 "supported_io_types": { 00:26:09.105 "read": true, 00:26:09.105 "write": true, 00:26:09.105 "unmap": true, 00:26:09.105 "flush": true, 00:26:09.105 "reset": true, 00:26:09.105 "nvme_admin": false, 00:26:09.105 "nvme_io": false, 00:26:09.105 "nvme_io_md": false, 00:26:09.105 "write_zeroes": true, 00:26:09.105 "zcopy": true, 00:26:09.105 "get_zone_info": false, 00:26:09.105 "zone_management": false, 00:26:09.105 "zone_append": false, 00:26:09.105 "compare": false, 00:26:09.105 "compare_and_write": false, 00:26:09.105 "abort": true, 00:26:09.105 "seek_hole": false, 00:26:09.105 "seek_data": false, 00:26:09.105 "copy": true, 00:26:09.105 "nvme_iov_md": false 00:26:09.105 }, 00:26:09.105 "memory_domains": [ 00:26:09.105 { 00:26:09.105 "dma_device_id": "system", 00:26:09.105 "dma_device_type": 1 00:26:09.105 }, 00:26:09.105 { 00:26:09.105 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:09.105 "dma_device_type": 
2
00:26:09.105 }
00:26:09.105 ],
00:26:09.105 "driver_specific": {}
00:26:09.105 }
00:26:09.105 ]
00:26:09.105 17:20:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:26:09.105 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:26:09.105 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:26:09.105 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:26:09.365 [2024-07-23 17:20:04.631130] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:26:09.365 [2024-07-23 17:20:04.631171] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:26:09.365 [2024-07-23 17:20:04.631192] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:26:09.365 [2024-07-23 17:20:04.632513] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:26:09.365 [2024-07-23 17:20:04.632554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:09.365 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:09.624 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:09.624 "name": "Existed_Raid",
00:26:09.624 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:09.624 "strip_size_kb": 0,
00:26:09.624 "state": "configuring",
00:26:09.624 "raid_level": "raid1",
00:26:09.624 "superblock": false,
00:26:09.624 "num_base_bdevs": 4,
00:26:09.624 "num_base_bdevs_discovered": 3,
00:26:09.624 "num_base_bdevs_operational": 4,
00:26:09.624 "base_bdevs_list": [
00:26:09.624 {
00:26:09.624 "name": "BaseBdev1",
00:26:09.624 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:09.624 "is_configured": false,
00:26:09.624 "data_offset": 0,
00:26:09.624 "data_size": 0
00:26:09.624 },
00:26:09.624 {
00:26:09.624 "name": "BaseBdev2",
00:26:09.624 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:09.624 "is_configured": true,
00:26:09.624 "data_offset": 0,
00:26:09.624 "data_size": 65536
00:26:09.624 },
00:26:09.624 {
00:26:09.624 "name": "BaseBdev3",
00:26:09.624 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:09.624 "is_configured": true,
00:26:09.624 "data_offset": 0,
00:26:09.624 "data_size": 65536
00:26:09.624 },
00:26:09.624 {
00:26:09.624 "name": "BaseBdev4",
00:26:09.624 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:09.624 "is_configured": true,
00:26:09.624 "data_offset": 0,
00:26:09.624 "data_size": 65536
00:26:09.624 }
00:26:09.624 ]
00:26:09.624 }'
00:26:09.624 17:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:09.624 17:20:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:10.192 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:26:10.451 [2024-07-23 17:20:05.701953] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:10.451 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:10.709 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:10.709 "name": "Existed_Raid",
00:26:10.709 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:10.709 "strip_size_kb": 0,
00:26:10.709 "state": "configuring",
00:26:10.709 "raid_level": "raid1",
00:26:10.709 "superblock": false,
00:26:10.709 "num_base_bdevs": 4,
00:26:10.709 "num_base_bdevs_discovered": 2,
00:26:10.709 "num_base_bdevs_operational": 4,
00:26:10.709 "base_bdevs_list": [
00:26:10.709 {
00:26:10.709 "name": "BaseBdev1",
00:26:10.709 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:10.709 "is_configured": false,
00:26:10.709 "data_offset": 0,
00:26:10.709 "data_size": 0
00:26:10.709 },
00:26:10.709 {
00:26:10.709 "name": null,
00:26:10.709 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:10.709 "is_configured": false,
00:26:10.709 "data_offset": 0,
00:26:10.709 "data_size": 65536
00:26:10.709 },
00:26:10.709 {
00:26:10.709 "name": "BaseBdev3",
00:26:10.709 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:10.709 "is_configured": true,
00:26:10.709 "data_offset": 0,
00:26:10.709 "data_size": 65536
00:26:10.709 },
00:26:10.709 {
00:26:10.709 "name": "BaseBdev4",
00:26:10.709 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:10.709 "is_configured": true,
00:26:10.709 "data_offset": 0,
00:26:10.709 "data_size": 65536
00:26:10.709 }
00:26:10.709 ]
00:26:10.709 }'
00:26:10.709 17:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:10.709 17:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:11.275 17:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:11.275 17:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:26:11.533 17:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:26:11.533 17:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:26:11.792 [2024-07-23 17:20:07.052847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:26:11.792 BaseBdev1
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:26:11.792 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:26:12.050 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:26:12.309 [
00:26:12.309 {
00:26:12.309 "name": "BaseBdev1",
00:26:12.309 "aliases": [
00:26:12.309 "7c8b176e-10b2-406d-926c-99b63051fd12"
00:26:12.309 ],
00:26:12.309 "product_name": "Malloc disk",
00:26:12.309 "block_size": 512,
00:26:12.309 "num_blocks": 65536,
00:26:12.309 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:12.309 "assigned_rate_limits": {
00:26:12.309 "rw_ios_per_sec": 0,
00:26:12.309 "rw_mbytes_per_sec": 0,
00:26:12.309 "r_mbytes_per_sec": 0,
00:26:12.309 "w_mbytes_per_sec": 0
00:26:12.309 },
00:26:12.309 "claimed": true,
00:26:12.309 "claim_type": "exclusive_write",
00:26:12.309 "zoned": false,
00:26:12.309 "supported_io_types": {
00:26:12.309 "read": true,
00:26:12.309 "write": true,
00:26:12.309 "unmap": true,
00:26:12.309 "flush": true,
00:26:12.309 "reset": true,
00:26:12.309 "nvme_admin": false,
00:26:12.309 "nvme_io": false,
00:26:12.309 "nvme_io_md": false,
00:26:12.309 "write_zeroes": true,
00:26:12.309 "zcopy": true,
00:26:12.309 "get_zone_info": false,
00:26:12.309 "zone_management": false,
00:26:12.309 "zone_append": false,
00:26:12.309 "compare": false,
00:26:12.309 "compare_and_write": false,
00:26:12.309 "abort": true,
00:26:12.309 "seek_hole": false,
00:26:12.309 "seek_data": false,
00:26:12.309 "copy": true,
00:26:12.309 "nvme_iov_md": false
00:26:12.309 },
00:26:12.309 "memory_domains": [
00:26:12.309 {
00:26:12.309 "dma_device_id": "system",
00:26:12.309 "dma_device_type": 1
00:26:12.309 },
00:26:12.309 {
00:26:12.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:26:12.309 "dma_device_type": 2
00:26:12.309 }
00:26:12.309 ],
00:26:12.309 "driver_specific": {}
00:26:12.309 }
00:26:12.309 ]
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:12.309 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:12.568 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:12.568 "name": "Existed_Raid",
00:26:12.568 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:12.568 "strip_size_kb": 0,
00:26:12.568 "state": "configuring",
00:26:12.568 "raid_level": "raid1",
00:26:12.568 "superblock": false,
00:26:12.568 "num_base_bdevs": 4,
00:26:12.568 "num_base_bdevs_discovered": 3,
00:26:12.568 "num_base_bdevs_operational": 4,
00:26:12.568 "base_bdevs_list": [
00:26:12.568 {
00:26:12.568 "name": "BaseBdev1",
00:26:12.568 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:12.568 "is_configured": true,
00:26:12.568 "data_offset": 0,
00:26:12.568 "data_size": 65536
00:26:12.568 },
00:26:12.568 {
00:26:12.568 "name": null,
00:26:12.568 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:12.568 "is_configured": false,
00:26:12.568 "data_offset": 0,
00:26:12.568 "data_size": 65536
00:26:12.568 },
00:26:12.568 {
00:26:12.568 "name": "BaseBdev3",
00:26:12.568 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:12.568 "is_configured": true,
00:26:12.568 "data_offset": 0,
00:26:12.568 "data_size": 65536
00:26:12.568 },
00:26:12.568 {
00:26:12.568 "name": "BaseBdev4",
00:26:12.568 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:12.568 "is_configured": true,
00:26:12.568 "data_offset": 0,
00:26:12.568 "data_size": 65536
00:26:12.568 }
00:26:12.568 ]
00:26:12.568 }'
00:26:12.568 17:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:12.568 17:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:13.135 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:13.135 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:26:13.394 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:26:13.394 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:26:13.652 [2024-07-23 17:20:08.881724] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:13.652 17:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:13.911 17:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:13.911 "name": "Existed_Raid",
00:26:13.911 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:13.911 "strip_size_kb": 0,
00:26:13.911 "state": "configuring",
00:26:13.911 "raid_level": "raid1",
00:26:13.911 "superblock": false,
00:26:13.911 "num_base_bdevs": 4,
00:26:13.911 "num_base_bdevs_discovered": 2,
00:26:13.911 "num_base_bdevs_operational": 4,
00:26:13.911 "base_bdevs_list": [
00:26:13.911 {
00:26:13.911 "name": "BaseBdev1",
00:26:13.911 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:13.911 "is_configured": true,
00:26:13.911 "data_offset": 0,
00:26:13.911 "data_size": 65536
00:26:13.911 },
00:26:13.911 {
00:26:13.911 "name": null,
00:26:13.911 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:13.911 "is_configured": false,
00:26:13.911 "data_offset": 0,
00:26:13.911 "data_size": 65536
00:26:13.911 },
00:26:13.911 {
00:26:13.911 "name": null,
00:26:13.911 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:13.911 "is_configured": false,
00:26:13.911 "data_offset": 0,
00:26:13.911 "data_size": 65536
00:26:13.911 },
00:26:13.911 {
00:26:13.911 "name": "BaseBdev4",
00:26:13.911 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:13.911 "is_configured": true,
00:26:13.911 "data_offset": 0,
00:26:13.911 "data_size": 65536
00:26:13.911 }
00:26:13.911 ]
00:26:13.911 }'
00:26:13.911 17:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:13.911 17:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:14.478 17:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:14.478 17:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:26:14.736 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:26:14.736 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:26:14.994 [2024-07-23 17:20:10.229348] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:14.994 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:15.253 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:15.253 "name": "Existed_Raid",
00:26:15.253 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:15.253 "strip_size_kb": 0,
00:26:15.253 "state": "configuring",
00:26:15.253 "raid_level": "raid1",
00:26:15.253 "superblock": false,
00:26:15.253 "num_base_bdevs": 4,
00:26:15.253 "num_base_bdevs_discovered": 3,
00:26:15.253 "num_base_bdevs_operational": 4,
00:26:15.253 "base_bdevs_list": [
00:26:15.253 {
00:26:15.253 "name": "BaseBdev1",
00:26:15.253 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:15.253 "is_configured": true,
00:26:15.253 "data_offset": 0,
00:26:15.253 "data_size": 65536
00:26:15.253 },
00:26:15.253 {
00:26:15.253 "name": null,
00:26:15.253 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:15.253 "is_configured": false,
00:26:15.253 "data_offset": 0,
00:26:15.253 "data_size": 65536
00:26:15.253 },
00:26:15.253 {
00:26:15.253 "name": "BaseBdev3",
00:26:15.253 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:15.253 "is_configured": true,
00:26:15.253 "data_offset": 0,
00:26:15.253 "data_size": 65536
00:26:15.253 },
00:26:15.253 {
00:26:15.253 "name": "BaseBdev4",
00:26:15.253 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:15.253 "is_configured": true,
00:26:15.253 "data_offset": 0,
00:26:15.253 "data_size": 65536
00:26:15.253 }
00:26:15.253 ]
00:26:15.253 }'
00:26:15.253 17:20:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:15.253 17:20:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:15.820 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:15.820 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:26:16.387 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:26:16.387 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:26:16.646 [2024-07-23 17:20:11.885750] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:16.646 17:20:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:16.905 17:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:16.905 "name": "Existed_Raid",
00:26:16.905 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:16.905 "strip_size_kb": 0,
00:26:16.905 "state": "configuring",
00:26:16.905 "raid_level": "raid1",
00:26:16.905 "superblock": false,
00:26:16.905 "num_base_bdevs": 4,
00:26:16.905 "num_base_bdevs_discovered": 2,
00:26:16.905 "num_base_bdevs_operational": 4,
00:26:16.905 "base_bdevs_list": [
00:26:16.905 {
00:26:16.905 "name": null,
00:26:16.905 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:16.905 "is_configured": false,
00:26:16.905 "data_offset": 0,
00:26:16.905 "data_size": 65536
00:26:16.905 },
00:26:16.905 {
00:26:16.905 "name": null,
00:26:16.905 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:16.905 "is_configured": false,
00:26:16.905 "data_offset": 0,
00:26:16.905 "data_size": 65536
00:26:16.905 },
00:26:16.905 {
00:26:16.905 "name": "BaseBdev3",
00:26:16.905 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:16.905 "is_configured": true,
00:26:16.905 "data_offset": 0,
00:26:16.905 "data_size": 65536
00:26:16.905 },
00:26:16.905 {
00:26:16.905 "name": "BaseBdev4",
00:26:16.905 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:16.905 "is_configured": true,
00:26:16.905 "data_offset": 0,
00:26:16.905 "data_size": 65536
00:26:16.905 }
00:26:16.905 ]
00:26:16.905 }'
00:26:16.905 17:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:16.905 17:20:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:17.472 17:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:26:17.472 17:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:17.731 17:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]]
00:26:17.731 17:20:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
00:26:17.990 [2024-07-23 17:20:13.153726] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:17.990 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:18.248 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:18.248 "name": "Existed_Raid",
00:26:18.248 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:18.248 "strip_size_kb": 0,
00:26:18.248 "state": "configuring",
00:26:18.248 "raid_level": "raid1",
00:26:18.248 "superblock": false,
00:26:18.248 "num_base_bdevs": 4,
00:26:18.248 "num_base_bdevs_discovered": 3,
00:26:18.248 "num_base_bdevs_operational": 4,
00:26:18.248 "base_bdevs_list": [
00:26:18.248 {
00:26:18.248 "name": null,
00:26:18.248 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:18.248 "is_configured": false,
00:26:18.248 "data_offset": 0,
00:26:18.248 "data_size": 65536
00:26:18.248 },
00:26:18.248 {
00:26:18.248 "name": "BaseBdev2",
00:26:18.248 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:18.248 "is_configured": true,
00:26:18.248 "data_offset": 0,
00:26:18.248 "data_size": 65536
00:26:18.248 },
00:26:18.248 {
00:26:18.248 "name": "BaseBdev3",
00:26:18.248 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:18.248 "is_configured": true,
00:26:18.248 "data_offset": 0,
00:26:18.248 "data_size": 65536
00:26:18.248 },
00:26:18.248 {
00:26:18.248 "name": "BaseBdev4",
00:26:18.248 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:18.248 "is_configured": true,
00:26:18.248 "data_offset": 0,
00:26:18.248 "data_size": 65536
00:26:18.248 }
00:26:18.248 ]
00:26:18.248 }'
00:26:18.248 17:20:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:18.248 17:20:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:26:19.183 17:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:19.183 17:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:26:19.183 17:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]]
00:26:19.183 17:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid'
00:26:19.183 17:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:19.442 17:20:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7c8b176e-10b2-406d-926c-99b63051fd12
00:26:20.010 [2024-07-23 17:20:15.263840] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed
00:26:20.010 [2024-07-23 17:20:15.263882] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2652ab0
00:26:20.010 [2024-07-23 17:20:15.263891] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512
00:26:20.010 [2024-07-23 17:20:15.264109] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27fcd00
00:26:20.010 [2024-07-23 17:20:15.264233] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2652ab0
00:26:20.010 [2024-07-23 17:20:15.264243] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2652ab0
00:26:20.010 [2024-07-23 17:20:15.264415] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:20.010 NewBaseBdev
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:26:20.010 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:26:20.312 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000
00:26:20.594 [
00:26:20.594 {
00:26:20.594 "name": "NewBaseBdev",
00:26:20.594 "aliases": [
00:26:20.594 "7c8b176e-10b2-406d-926c-99b63051fd12"
00:26:20.594 ],
00:26:20.594 "product_name": "Malloc disk",
00:26:20.594 "block_size": 512,
00:26:20.594 "num_blocks": 65536,
00:26:20.594 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:20.594 "assigned_rate_limits": {
00:26:20.594 "rw_ios_per_sec": 0,
00:26:20.594 "rw_mbytes_per_sec": 0,
00:26:20.594 "r_mbytes_per_sec": 0,
00:26:20.594 "w_mbytes_per_sec": 0
00:26:20.594 },
00:26:20.594 "claimed": true,
00:26:20.594 "claim_type": "exclusive_write",
00:26:20.594 "zoned": false,
00:26:20.594 "supported_io_types": {
00:26:20.594 "read": true,
00:26:20.594 "write": true,
00:26:20.594 "unmap": true,
00:26:20.594 "flush": true,
00:26:20.594 "reset": true,
00:26:20.594 "nvme_admin": false,
00:26:20.594 "nvme_io": false,
00:26:20.594 "nvme_io_md": false,
00:26:20.594 "write_zeroes": true,
00:26:20.594 "zcopy": true,
00:26:20.594 "get_zone_info": false,
00:26:20.594 "zone_management": false,
00:26:20.594 "zone_append": false,
00:26:20.594 "compare": false,
00:26:20.594 "compare_and_write": false,
00:26:20.594 "abort": true,
00:26:20.594 "seek_hole": false,
00:26:20.594 "seek_data": false,
00:26:20.594 "copy": true,
00:26:20.594 "nvme_iov_md": false
00:26:20.594 },
00:26:20.594 "memory_domains": [
00:26:20.594 {
00:26:20.594 "dma_device_id": "system",
00:26:20.594 "dma_device_type": 1
00:26:20.594 },
00:26:20.594 {
00:26:20.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:26:20.594 "dma_device_type": 2
00:26:20.594 }
00:26:20.594 ],
00:26:20.594 "driver_specific": {}
00:26:20.594 }
00:26:20.594 ]
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:20.594 17:20:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:20.853 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:20.853 "name": "Existed_Raid",
00:26:20.853 "uuid": "8bf7ace5-1746-4650-9fa6-09c1f1da97f3",
00:26:20.853 "strip_size_kb": 0,
00:26:20.853 "state": "online",
00:26:20.853 "raid_level": "raid1",
00:26:20.853 "superblock": false,
00:26:20.853 "num_base_bdevs": 4,
00:26:20.853 "num_base_bdevs_discovered": 4,
00:26:20.853 "num_base_bdevs_operational": 4,
00:26:20.853 "base_bdevs_list": [
00:26:20.853 {
00:26:20.853 "name": "NewBaseBdev",
00:26:20.854 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12",
00:26:20.854 "is_configured": true,
00:26:20.854 "data_offset": 0,
00:26:20.854 "data_size": 65536
00:26:20.854 },
00:26:20.854 {
00:26:20.854 "name": "BaseBdev2",
00:26:20.854 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a",
00:26:20.854 "is_configured": true,
00:26:20.854 "data_offset": 0,
00:26:20.854 "data_size": 65536
00:26:20.854 },
00:26:20.854 {
00:26:20.854 "name": "BaseBdev3",
00:26:20.854 "uuid": "adc44440-1292-4a28-8536-cbf383132460",
00:26:20.854 "is_configured": true,
00:26:20.854 "data_offset": 0,
00:26:20.854 "data_size": 65536
00:26:20.854 },
00:26:20.854 {
00:26:20.854 "name": "BaseBdev4",
00:26:20.854 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767",
00:26:20.854 "is_configured": true,
00:26:20.854 "data_offset": 0,
00:26:20.854
"data_size": 65536 00:26:20.854 } 00:26:20.854 ] 00:26:20.854 }' 00:26:20.854 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:20.854 17:20:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:21.790 17:20:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:21.790 [2024-07-23 17:20:17.157217] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:21.790 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:21.790 "name": "Existed_Raid", 00:26:21.790 "aliases": [ 00:26:21.790 "8bf7ace5-1746-4650-9fa6-09c1f1da97f3" 00:26:21.790 ], 00:26:21.790 "product_name": "Raid Volume", 00:26:21.790 "block_size": 512, 00:26:21.790 "num_blocks": 65536, 00:26:21.790 "uuid": "8bf7ace5-1746-4650-9fa6-09c1f1da97f3", 00:26:21.790 "assigned_rate_limits": { 00:26:21.790 "rw_ios_per_sec": 0, 00:26:21.790 "rw_mbytes_per_sec": 0, 00:26:21.790 "r_mbytes_per_sec": 0, 00:26:21.790 "w_mbytes_per_sec": 0 00:26:21.790 }, 00:26:21.790 "claimed": false, 
00:26:21.790 "zoned": false, 00:26:21.790 "supported_io_types": { 00:26:21.790 "read": true, 00:26:21.790 "write": true, 00:26:21.790 "unmap": false, 00:26:21.790 "flush": false, 00:26:21.790 "reset": true, 00:26:21.790 "nvme_admin": false, 00:26:21.790 "nvme_io": false, 00:26:21.790 "nvme_io_md": false, 00:26:21.790 "write_zeroes": true, 00:26:21.790 "zcopy": false, 00:26:21.790 "get_zone_info": false, 00:26:21.790 "zone_management": false, 00:26:21.790 "zone_append": false, 00:26:21.790 "compare": false, 00:26:21.790 "compare_and_write": false, 00:26:21.790 "abort": false, 00:26:21.790 "seek_hole": false, 00:26:21.790 "seek_data": false, 00:26:21.790 "copy": false, 00:26:21.790 "nvme_iov_md": false 00:26:21.790 }, 00:26:21.790 "memory_domains": [ 00:26:21.790 { 00:26:21.790 "dma_device_id": "system", 00:26:21.790 "dma_device_type": 1 00:26:21.790 }, 00:26:21.790 { 00:26:21.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.790 "dma_device_type": 2 00:26:21.790 }, 00:26:21.790 { 00:26:21.790 "dma_device_id": "system", 00:26:21.791 "dma_device_type": 1 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.791 "dma_device_type": 2 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "dma_device_id": "system", 00:26:21.791 "dma_device_type": 1 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.791 "dma_device_type": 2 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "dma_device_id": "system", 00:26:21.791 "dma_device_type": 1 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:21.791 "dma_device_type": 2 00:26:21.791 } 00:26:21.791 ], 00:26:21.791 "driver_specific": { 00:26:21.791 "raid": { 00:26:21.791 "uuid": "8bf7ace5-1746-4650-9fa6-09c1f1da97f3", 00:26:21.791 "strip_size_kb": 0, 00:26:21.791 "state": "online", 00:26:21.791 "raid_level": "raid1", 00:26:21.791 "superblock": false, 00:26:21.791 "num_base_bdevs": 4, 00:26:21.791 
"num_base_bdevs_discovered": 4, 00:26:21.791 "num_base_bdevs_operational": 4, 00:26:21.791 "base_bdevs_list": [ 00:26:21.791 { 00:26:21.791 "name": "NewBaseBdev", 00:26:21.791 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12", 00:26:21.791 "is_configured": true, 00:26:21.791 "data_offset": 0, 00:26:21.791 "data_size": 65536 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "name": "BaseBdev2", 00:26:21.791 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a", 00:26:21.791 "is_configured": true, 00:26:21.791 "data_offset": 0, 00:26:21.791 "data_size": 65536 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "name": "BaseBdev3", 00:26:21.791 "uuid": "adc44440-1292-4a28-8536-cbf383132460", 00:26:21.791 "is_configured": true, 00:26:21.791 "data_offset": 0, 00:26:21.791 "data_size": 65536 00:26:21.791 }, 00:26:21.791 { 00:26:21.791 "name": "BaseBdev4", 00:26:21.791 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767", 00:26:21.791 "is_configured": true, 00:26:21.791 "data_offset": 0, 00:26:21.791 "data_size": 65536 00:26:21.791 } 00:26:21.791 ] 00:26:21.791 } 00:26:21.791 } 00:26:21.791 }' 00:26:21.791 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:22.050 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:26:22.050 BaseBdev2 00:26:22.050 BaseBdev3 00:26:22.050 BaseBdev4' 00:26:22.050 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:22.050 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:26:22.050 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:22.309 "name": "NewBaseBdev", 
00:26:22.309 "aliases": [ 00:26:22.309 "7c8b176e-10b2-406d-926c-99b63051fd12" 00:26:22.309 ], 00:26:22.309 "product_name": "Malloc disk", 00:26:22.309 "block_size": 512, 00:26:22.309 "num_blocks": 65536, 00:26:22.309 "uuid": "7c8b176e-10b2-406d-926c-99b63051fd12", 00:26:22.309 "assigned_rate_limits": { 00:26:22.309 "rw_ios_per_sec": 0, 00:26:22.309 "rw_mbytes_per_sec": 0, 00:26:22.309 "r_mbytes_per_sec": 0, 00:26:22.309 "w_mbytes_per_sec": 0 00:26:22.309 }, 00:26:22.309 "claimed": true, 00:26:22.309 "claim_type": "exclusive_write", 00:26:22.309 "zoned": false, 00:26:22.309 "supported_io_types": { 00:26:22.309 "read": true, 00:26:22.309 "write": true, 00:26:22.309 "unmap": true, 00:26:22.309 "flush": true, 00:26:22.309 "reset": true, 00:26:22.309 "nvme_admin": false, 00:26:22.309 "nvme_io": false, 00:26:22.309 "nvme_io_md": false, 00:26:22.309 "write_zeroes": true, 00:26:22.309 "zcopy": true, 00:26:22.309 "get_zone_info": false, 00:26:22.309 "zone_management": false, 00:26:22.309 "zone_append": false, 00:26:22.309 "compare": false, 00:26:22.309 "compare_and_write": false, 00:26:22.309 "abort": true, 00:26:22.309 "seek_hole": false, 00:26:22.309 "seek_data": false, 00:26:22.309 "copy": true, 00:26:22.309 "nvme_iov_md": false 00:26:22.309 }, 00:26:22.309 "memory_domains": [ 00:26:22.309 { 00:26:22.309 "dma_device_id": "system", 00:26:22.309 "dma_device_type": 1 00:26:22.309 }, 00:26:22.309 { 00:26:22.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:22.309 "dma_device_type": 2 00:26:22.309 } 00:26:22.309 ], 00:26:22.309 "driver_specific": {} 00:26:22.309 }' 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:22.309 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:22.569 17:20:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:23.137 "name": "BaseBdev2", 00:26:23.137 "aliases": [ 00:26:23.137 "a2a11f5e-36c9-44a5-bc17-7f054f758c7a" 00:26:23.137 ], 00:26:23.137 "product_name": "Malloc disk", 00:26:23.137 "block_size": 512, 00:26:23.137 "num_blocks": 65536, 00:26:23.137 "uuid": "a2a11f5e-36c9-44a5-bc17-7f054f758c7a", 00:26:23.137 "assigned_rate_limits": { 00:26:23.137 "rw_ios_per_sec": 0, 00:26:23.137 "rw_mbytes_per_sec": 0, 00:26:23.137 "r_mbytes_per_sec": 0, 00:26:23.137 "w_mbytes_per_sec": 0 00:26:23.137 }, 00:26:23.137 "claimed": true, 00:26:23.137 "claim_type": "exclusive_write", 00:26:23.137 "zoned": false, 00:26:23.137 "supported_io_types": { 00:26:23.137 
"read": true, 00:26:23.137 "write": true, 00:26:23.137 "unmap": true, 00:26:23.137 "flush": true, 00:26:23.137 "reset": true, 00:26:23.137 "nvme_admin": false, 00:26:23.137 "nvme_io": false, 00:26:23.137 "nvme_io_md": false, 00:26:23.137 "write_zeroes": true, 00:26:23.137 "zcopy": true, 00:26:23.137 "get_zone_info": false, 00:26:23.137 "zone_management": false, 00:26:23.137 "zone_append": false, 00:26:23.137 "compare": false, 00:26:23.137 "compare_and_write": false, 00:26:23.137 "abort": true, 00:26:23.137 "seek_hole": false, 00:26:23.137 "seek_data": false, 00:26:23.137 "copy": true, 00:26:23.137 "nvme_iov_md": false 00:26:23.137 }, 00:26:23.137 "memory_domains": [ 00:26:23.137 { 00:26:23.137 "dma_device_id": "system", 00:26:23.137 "dma_device_type": 1 00:26:23.137 }, 00:26:23.137 { 00:26:23.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.137 "dma_device_type": 2 00:26:23.137 } 00:26:23.137 ], 00:26:23.137 "driver_specific": {} 00:26:23.137 }' 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:23.137 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:26:23.396 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.396 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:23.396 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:23.396 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:23.396 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:23.654 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:23.654 "name": "BaseBdev3", 00:26:23.654 "aliases": [ 00:26:23.654 "adc44440-1292-4a28-8536-cbf383132460" 00:26:23.654 ], 00:26:23.654 "product_name": "Malloc disk", 00:26:23.654 "block_size": 512, 00:26:23.654 "num_blocks": 65536, 00:26:23.654 "uuid": "adc44440-1292-4a28-8536-cbf383132460", 00:26:23.654 "assigned_rate_limits": { 00:26:23.654 "rw_ios_per_sec": 0, 00:26:23.654 "rw_mbytes_per_sec": 0, 00:26:23.654 "r_mbytes_per_sec": 0, 00:26:23.654 "w_mbytes_per_sec": 0 00:26:23.654 }, 00:26:23.654 "claimed": true, 00:26:23.654 "claim_type": "exclusive_write", 00:26:23.654 "zoned": false, 00:26:23.654 "supported_io_types": { 00:26:23.654 "read": true, 00:26:23.654 "write": true, 00:26:23.654 "unmap": true, 00:26:23.654 "flush": true, 00:26:23.654 "reset": true, 00:26:23.654 "nvme_admin": false, 00:26:23.654 "nvme_io": false, 00:26:23.654 "nvme_io_md": false, 00:26:23.654 "write_zeroes": true, 00:26:23.654 "zcopy": true, 00:26:23.654 "get_zone_info": false, 00:26:23.654 "zone_management": false, 00:26:23.654 "zone_append": false, 00:26:23.654 "compare": false, 00:26:23.655 "compare_and_write": false, 00:26:23.655 "abort": true, 00:26:23.655 "seek_hole": false, 00:26:23.655 "seek_data": false, 00:26:23.655 "copy": true, 00:26:23.655 "nvme_iov_md": 
false 00:26:23.655 }, 00:26:23.655 "memory_domains": [ 00:26:23.655 { 00:26:23.655 "dma_device_id": "system", 00:26:23.655 "dma_device_type": 1 00:26:23.655 }, 00:26:23.655 { 00:26:23.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.655 "dma_device_type": 2 00:26:23.655 } 00:26:23.655 ], 00:26:23.655 "driver_specific": {} 00:26:23.655 }' 00:26:23.655 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.655 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.655 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:23.655 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.655 17:20:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.655 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:23.655 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.655 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:26:23.913 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:26:24.171 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:24.171 "name": "BaseBdev4", 00:26:24.171 "aliases": [ 00:26:24.171 "f6b1c274-e556-4df8-9df4-1f3017fa0767" 00:26:24.171 ], 00:26:24.171 "product_name": "Malloc disk", 00:26:24.171 "block_size": 512, 00:26:24.171 "num_blocks": 65536, 00:26:24.171 "uuid": "f6b1c274-e556-4df8-9df4-1f3017fa0767", 00:26:24.171 "assigned_rate_limits": { 00:26:24.171 "rw_ios_per_sec": 0, 00:26:24.171 "rw_mbytes_per_sec": 0, 00:26:24.171 "r_mbytes_per_sec": 0, 00:26:24.171 "w_mbytes_per_sec": 0 00:26:24.171 }, 00:26:24.171 "claimed": true, 00:26:24.171 "claim_type": "exclusive_write", 00:26:24.171 "zoned": false, 00:26:24.171 "supported_io_types": { 00:26:24.171 "read": true, 00:26:24.171 "write": true, 00:26:24.171 "unmap": true, 00:26:24.171 "flush": true, 00:26:24.171 "reset": true, 00:26:24.171 "nvme_admin": false, 00:26:24.171 "nvme_io": false, 00:26:24.171 "nvme_io_md": false, 00:26:24.171 "write_zeroes": true, 00:26:24.171 "zcopy": true, 00:26:24.171 "get_zone_info": false, 00:26:24.171 "zone_management": false, 00:26:24.171 "zone_append": false, 00:26:24.171 "compare": false, 00:26:24.171 "compare_and_write": false, 00:26:24.171 "abort": true, 00:26:24.171 "seek_hole": false, 00:26:24.171 "seek_data": false, 00:26:24.171 "copy": true, 00:26:24.171 "nvme_iov_md": false 00:26:24.171 }, 00:26:24.171 "memory_domains": [ 00:26:24.171 { 00:26:24.171 "dma_device_id": "system", 00:26:24.171 "dma_device_type": 1 00:26:24.171 }, 00:26:24.171 { 00:26:24.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:24.171 "dma_device_type": 2 00:26:24.171 } 00:26:24.171 ], 00:26:24.171 "driver_specific": {} 00:26:24.171 }' 00:26:24.171 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:24.171 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:24.171 17:20:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:24.171 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:24.429 17:20:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:24.997 [2024-07-23 17:20:20.305379] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:24.997 [2024-07-23 17:20:20.305413] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:24.997 [2024-07-23 17:20:20.305479] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:24.997 [2024-07-23 17:20:20.305743] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:24.997 [2024-07-23 17:20:20.305756] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2652ab0 name Existed_Raid, state offline 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 9789 00:26:24.997 17:20:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 9789 ']' 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 9789 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 9789 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 9789' 00:26:24.997 killing process with pid 9789 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 9789 00:26:24.997 [2024-07-23 17:20:20.389999] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:24.997 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 9789 00:26:25.256 [2024-07-23 17:20:20.433404] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:25.256 17:20:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:26:25.256 00:26:25.257 real 0m35.275s 00:26:25.257 user 1m4.895s 00:26:25.257 sys 0m6.282s 00:26:25.257 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:25.257 17:20:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:26:25.257 ************************************ 00:26:25.257 END TEST raid_state_function_test 00:26:25.257 ************************************ 00:26:25.516 17:20:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 
0 00:26:25.516 17:20:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:26:25.516 17:20:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:25.516 17:20:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:25.516 17:20:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:25.516 ************************************ 00:26:25.516 START TEST raid_state_function_test_sb 00:26:25.516 ************************************ 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.516 
17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=15012 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 
'Process raid pid: 15012' 00:26:25.516 Process raid pid: 15012 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 15012 /var/tmp/spdk-raid.sock 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 15012 ']' 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:25.516 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:25.516 17:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:25.516 [2024-07-23 17:20:20.803268] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:26:25.516 [2024-07-23 17:20:20.803322] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:25.517 [2024-07-23 17:20:20.917667] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:25.776 [2024-07-23 17:20:20.969054] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:25.776 [2024-07-23 17:20:21.038551] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:25.776 [2024-07-23 17:20:21.038579] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:25.776 17:20:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:25.776 17:20:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:26:25.776 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:26:26.035 [2024-07-23 17:20:21.247253] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:26.035 [2024-07-23 17:20:21.247290] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:26.035 [2024-07-23 17:20:21.247301] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:26.035 [2024-07-23 17:20:21.247312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:26.035 [2024-07-23 17:20:21.247321] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:26:26.035 [2024-07-23 17:20:21.247332] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:26:26.035 [2024-07-23 17:20:21.247341] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:26:26.035 [2024-07-23 17:20:21.247352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.035 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:26.294 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.294 "name": "Existed_Raid", 00:26:26.294 "uuid": "2b953a9e-1c1f-4b62-9c26-2d14848bbd6a", 
00:26:26.294 "strip_size_kb": 0,
00:26:26.294 "state": "configuring",
00:26:26.294 "raid_level": "raid1",
00:26:26.294 "superblock": true,
00:26:26.294 "num_base_bdevs": 4,
00:26:26.294 "num_base_bdevs_discovered": 0,
00:26:26.294 "num_base_bdevs_operational": 4,
00:26:26.294 "base_bdevs_list": [
00:26:26.294 {
00:26:26.294 "name": "BaseBdev1",
00:26:26.294 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:26.294 "is_configured": false,
00:26:26.294 "data_offset": 0,
00:26:26.294 "data_size": 0
00:26:26.294 },
00:26:26.294 {
00:26:26.294 "name": "BaseBdev2",
00:26:26.294 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:26.294 "is_configured": false,
00:26:26.294 "data_offset": 0,
00:26:26.294 "data_size": 0
00:26:26.294 },
00:26:26.294 {
00:26:26.294 "name": "BaseBdev3",
00:26:26.294 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:26.294 "is_configured": false,
00:26:26.294 "data_offset": 0,
00:26:26.294 "data_size": 0
00:26:26.294 },
00:26:26.294 {
00:26:26.294 "name": "BaseBdev4",
00:26:26.294 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:26.294 "is_configured": false,
00:26:26.294 "data_offset": 0,
00:26:26.294 "data_size": 0
00:26:26.294 }
00:26:26.294 ]
00:26:26.294 }'
00:26:26.294 17:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:26.294 17:20:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:26.862 17:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:26:26.862 [2024-07-23 17:20:22.221672] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:26:26.862 [2024-07-23 17:20:22.221700] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x93f410 name Existed_Raid, state configuring
00:26:26.862 17:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:26:27.120 [2024-07-23 17:20:22.390154] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:26:27.120 [2024-07-23 17:20:22.390180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:26:27.120 [2024-07-23 17:20:22.390190] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:26:27.120 [2024-07-23 17:20:22.390201] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:26:27.120 [2024-07-23 17:20:22.390209] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:26:27.120 [2024-07-23 17:20:22.390220] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:26:27.120 [2024-07-23 17:20:22.390229] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:26:27.120 [2024-07-23 17:20:22.390240] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:26:27.120 17:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:26:27.379 [2024-07-23 17:20:22.576488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:26:27.379 BaseBdev1
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:26:27.379 17:20:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:26:27.638 [
00:26:27.638 {
00:26:27.638 "name": "BaseBdev1",
00:26:27.638 "aliases": [
00:26:27.638 "31e87743-ca57-4025-ab8b-00e95881fe8f"
00:26:27.638 ],
00:26:27.638 "product_name": "Malloc disk",
00:26:27.638 "block_size": 512,
00:26:27.638 "num_blocks": 65536,
00:26:27.638 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f",
00:26:27.638 "assigned_rate_limits": {
00:26:27.638 "rw_ios_per_sec": 0,
00:26:27.638 "rw_mbytes_per_sec": 0,
00:26:27.638 "r_mbytes_per_sec": 0,
00:26:27.638 "w_mbytes_per_sec": 0
00:26:27.638 },
00:26:27.638 "claimed": true,
00:26:27.638 "claim_type": "exclusive_write",
00:26:27.638 "zoned": false,
00:26:27.638 "supported_io_types": {
00:26:27.638 "read": true,
00:26:27.638 "write": true,
00:26:27.638 "unmap": true,
00:26:27.638 "flush": true,
00:26:27.638 "reset": true,
00:26:27.638 "nvme_admin": false,
00:26:27.638 "nvme_io": false,
00:26:27.638 "nvme_io_md": false,
00:26:27.638 "write_zeroes": true,
00:26:27.638 "zcopy": true,
00:26:27.638 "get_zone_info": false,
00:26:27.638 "zone_management": false,
00:26:27.638 "zone_append": false,
00:26:27.638 "compare": false,
00:26:27.638 "compare_and_write": false,
00:26:27.638 "abort": true,
00:26:27.638 "seek_hole": false,
00:26:27.638 "seek_data": false,
00:26:27.638 "copy": true,
00:26:27.638 "nvme_iov_md": false
00:26:27.638 },
00:26:27.638 "memory_domains": [
00:26:27.638 {
00:26:27.638 "dma_device_id": "system",
00:26:27.638 "dma_device_type": 1
00:26:27.638 },
00:26:27.638 {
00:26:27.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:26:27.638 "dma_device_type": 2
00:26:27.638 }
00:26:27.638 ],
00:26:27.638 "driver_specific": {}
00:26:27.638 }
00:26:27.638 ]
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:27.638 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:27.897 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:27.897 "name": "Existed_Raid",
00:26:27.897 "uuid": "be97d664-39fd-4d2c-b23b-d60277a400d6",
00:26:27.897 "strip_size_kb": 0,
00:26:27.897 "state": "configuring",
00:26:27.897 "raid_level": "raid1",
00:26:27.897 "superblock": true,
00:26:27.897 "num_base_bdevs": 4,
00:26:27.897 "num_base_bdevs_discovered": 1,
00:26:27.897 "num_base_bdevs_operational": 4,
00:26:27.897 "base_bdevs_list": [
00:26:27.897 {
00:26:27.897 "name": "BaseBdev1",
00:26:27.897 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f",
00:26:27.897 "is_configured": true,
00:26:27.897 "data_offset": 2048,
00:26:27.897 "data_size": 63488
00:26:27.897 },
00:26:27.897 {
00:26:27.897 "name": "BaseBdev2",
00:26:27.897 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:27.897 "is_configured": false,
00:26:27.897 "data_offset": 0,
00:26:27.897 "data_size": 0
00:26:27.897 },
00:26:27.897 {
00:26:27.897 "name": "BaseBdev3",
00:26:27.897 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:27.897 "is_configured": false,
00:26:27.897 "data_offset": 0,
00:26:27.897 "data_size": 0
00:26:27.897 },
00:26:27.897 {
00:26:27.897 "name": "BaseBdev4",
00:26:27.897 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:27.897 "is_configured": false,
00:26:27.897 "data_offset": 0,
00:26:27.897 "data_size": 0
00:26:27.897 }
00:26:27.897 ]
00:26:27.897 }'
00:26:27.897 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:27.897 17:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:28.463 17:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:26:28.722 [2024-07-23 17:20:24.024323] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:26:28.722 [2024-07-23 17:20:24.024360] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x93ed40 name Existed_Raid, state configuring
00:26:28.722 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:26:28.981 [2024-07-23 17:20:24.208850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:26:28.981 [2024-07-23 17:20:24.210338] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:26:28.981 [2024-07-23 17:20:24.210371] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:26:28.981 [2024-07-23 17:20:24.210381] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:26:28.981 [2024-07-23 17:20:24.210393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:26:28.981 [2024-07-23 17:20:24.210402] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:26:28.981 [2024-07-23 17:20:24.210414] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:28.981 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:29.240 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:29.240 "name": "Existed_Raid",
00:26:29.240 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419",
00:26:29.240 "strip_size_kb": 0,
00:26:29.240 "state": "configuring",
00:26:29.240 "raid_level": "raid1",
00:26:29.240 "superblock": true,
00:26:29.240 "num_base_bdevs": 4,
00:26:29.240 "num_base_bdevs_discovered": 1,
00:26:29.240 "num_base_bdevs_operational": 4,
00:26:29.240 "base_bdevs_list": [
00:26:29.240 {
00:26:29.240 "name": "BaseBdev1",
00:26:29.240 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f",
00:26:29.240 "is_configured": true,
00:26:29.240 "data_offset": 2048,
00:26:29.240 "data_size": 63488
00:26:29.240 },
00:26:29.240 {
00:26:29.240 "name": "BaseBdev2",
00:26:29.240 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:29.240 "is_configured": false,
00:26:29.240 "data_offset": 0,
00:26:29.240 "data_size": 0
00:26:29.240 },
00:26:29.240 {
00:26:29.240 "name": "BaseBdev3",
00:26:29.240 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:29.240 "is_configured": false,
00:26:29.240 "data_offset": 0,
00:26:29.240 "data_size": 0
00:26:29.240 },
00:26:29.240 {
00:26:29.240 "name": "BaseBdev4",
00:26:29.240 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:29.240 "is_configured": false,
00:26:29.240 "data_offset": 0,
00:26:29.240 "data_size": 0
00:26:29.240 }
00:26:29.240 ]
00:26:29.240 }'
00:26:29.240 17:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:29.240 17:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:29.809 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:26:30.067 [2024-07-23 17:20:25.327223] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:26:30.067 BaseBdev2
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:26:30.067 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:26:30.326 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:26:30.327 [
00:26:30.327 {
00:26:30.327 "name": "BaseBdev2",
00:26:30.327 "aliases": [
00:26:30.327 "71c27226-8526-47e6-8c9b-6285ae2e9db9"
00:26:30.327 ],
00:26:30.327 "product_name": "Malloc disk",
00:26:30.327 "block_size": 512,
00:26:30.327 "num_blocks": 65536,
00:26:30.327 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9",
00:26:30.327 "assigned_rate_limits": {
00:26:30.327 "rw_ios_per_sec": 0,
00:26:30.327 "rw_mbytes_per_sec": 0,
00:26:30.327 "r_mbytes_per_sec": 0,
00:26:30.327 "w_mbytes_per_sec": 0
00:26:30.327 },
00:26:30.327 "claimed": true,
00:26:30.327 "claim_type": "exclusive_write",
00:26:30.327 "zoned": false,
00:26:30.327 "supported_io_types": {
00:26:30.327 "read": true,
00:26:30.327 "write": true,
00:26:30.327 "unmap": true,
00:26:30.327 "flush": true,
00:26:30.327 "reset": true,
00:26:30.327 "nvme_admin": false,
00:26:30.327 "nvme_io": false,
00:26:30.327 "nvme_io_md": false,
00:26:30.327 "write_zeroes": true,
00:26:30.327 "zcopy": true,
00:26:30.327 "get_zone_info": false,
00:26:30.327 "zone_management": false,
00:26:30.327 "zone_append": false,
00:26:30.327 "compare": false,
00:26:30.327 "compare_and_write": false,
00:26:30.327 "abort": true,
00:26:30.327 "seek_hole": false,
00:26:30.327 "seek_data": false,
00:26:30.327 "copy": true,
00:26:30.327 "nvme_iov_md": false
00:26:30.327 },
00:26:30.327 "memory_domains": [
00:26:30.327 {
00:26:30.327 "dma_device_id": "system",
00:26:30.327 "dma_device_type": 1
00:26:30.327 },
00:26:30.327 {
00:26:30.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:26:30.327 "dma_device_type": 2
00:26:30.327 }
00:26:30.327 ],
00:26:30.327 "driver_specific": {}
00:26:30.327 }
00:26:30.327 ]
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:30.327 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:30.587 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:30.587 "name": "Existed_Raid",
00:26:30.587 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419",
00:26:30.587 "strip_size_kb": 0,
00:26:30.587 "state": "configuring",
00:26:30.587 "raid_level": "raid1",
00:26:30.587 "superblock": true,
00:26:30.587 "num_base_bdevs": 4,
00:26:30.587 "num_base_bdevs_discovered": 2,
00:26:30.587 "num_base_bdevs_operational": 4,
00:26:30.587 "base_bdevs_list": [
00:26:30.587 {
00:26:30.587 "name": "BaseBdev1",
00:26:30.587 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f",
00:26:30.587 "is_configured": true,
00:26:30.587 "data_offset": 2048,
00:26:30.587 "data_size": 63488
00:26:30.587 },
00:26:30.587 {
00:26:30.587 "name": "BaseBdev2",
00:26:30.587 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9",
00:26:30.587 "is_configured": true,
00:26:30.587 "data_offset": 2048,
00:26:30.587 "data_size": 63488
00:26:30.587 },
00:26:30.587 {
00:26:30.587 "name": "BaseBdev3",
00:26:30.587 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:30.587 "is_configured": false,
00:26:30.587 "data_offset": 0,
00:26:30.587 "data_size": 0
00:26:30.587 },
00:26:30.587 {
00:26:30.587 "name": "BaseBdev4",
00:26:30.587 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:30.587 "is_configured": false,
00:26:30.587 "data_offset": 0,
00:26:30.587 "data_size": 0
00:26:30.587 }
00:26:30.587 ]
00:26:30.587 }'
00:26:30.587 17:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:30.587 17:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:31.154 17:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:26:31.413 [2024-07-23 17:20:26.798584] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:26:31.413 BaseBdev3
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:26:31.413 17:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:26:31.672 17:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:26:31.931 [
00:26:31.931 {
00:26:31.931 "name": "BaseBdev3",
00:26:31.931 "aliases": [
00:26:31.931 "3b3b6fee-f02d-4cd2-97c5-4d8173541347"
00:26:31.931 ],
00:26:31.931 "product_name": "Malloc disk",
00:26:31.931 "block_size": 512,
00:26:31.931 "num_blocks": 65536,
00:26:31.931 "uuid": "3b3b6fee-f02d-4cd2-97c5-4d8173541347",
00:26:31.931 "assigned_rate_limits": {
00:26:31.931 "rw_ios_per_sec": 0,
00:26:31.931 "rw_mbytes_per_sec": 0,
00:26:31.931 "r_mbytes_per_sec": 0,
00:26:31.931 "w_mbytes_per_sec": 0
00:26:31.931 },
00:26:31.931 "claimed": true,
00:26:31.931 "claim_type": "exclusive_write",
00:26:31.931 "zoned": false,
00:26:31.931 "supported_io_types": {
00:26:31.931 "read": true,
00:26:31.931 "write": true,
00:26:31.931 "unmap": true,
00:26:31.931 "flush": true,
00:26:31.931 "reset": true,
00:26:31.931 "nvme_admin": false,
00:26:31.931 "nvme_io": false,
00:26:31.931 "nvme_io_md": false,
00:26:31.931 "write_zeroes": true,
00:26:31.931 "zcopy": true,
00:26:31.931 "get_zone_info": false,
00:26:31.931 "zone_management": false,
00:26:31.931 "zone_append": false,
00:26:31.931 "compare": false,
00:26:31.931 "compare_and_write": false,
00:26:31.931 "abort": true,
00:26:31.931 "seek_hole": false,
00:26:31.931 "seek_data": false,
00:26:31.931 "copy": true,
00:26:31.931 "nvme_iov_md": false
00:26:31.931 },
00:26:31.931 "memory_domains": [
00:26:31.931 {
00:26:31.931 "dma_device_id": "system",
00:26:31.931 "dma_device_type": 1
00:26:31.931 },
00:26:31.931 {
00:26:31.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:26:31.931 "dma_device_type": 2
00:26:31.931 }
00:26:31.931 ],
00:26:31.931 "driver_specific": {}
00:26:31.931 }
00:26:31.931 ]
00:26:31.931 17:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:31.932 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:32.282 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:32.282 "name": "Existed_Raid",
00:26:32.282 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419",
00:26:32.282 "strip_size_kb": 0,
00:26:32.282 "state": "configuring",
00:26:32.282 "raid_level": "raid1",
00:26:32.282 "superblock": true,
00:26:32.282 "num_base_bdevs": 4,
00:26:32.282 "num_base_bdevs_discovered": 3,
00:26:32.282 "num_base_bdevs_operational": 4,
00:26:32.282 "base_bdevs_list": [
00:26:32.282 {
00:26:32.282 "name": "BaseBdev1",
00:26:32.282 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f",
00:26:32.282 "is_configured": true,
00:26:32.282 "data_offset": 2048,
00:26:32.282 "data_size": 63488
00:26:32.282 },
00:26:32.282 {
00:26:32.282 "name": "BaseBdev2",
00:26:32.283 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9",
00:26:32.283 "is_configured": true,
00:26:32.283 "data_offset": 2048,
00:26:32.283 "data_size": 63488
00:26:32.283 },
00:26:32.283 {
00:26:32.283 "name": "BaseBdev3",
00:26:32.283 "uuid": "3b3b6fee-f02d-4cd2-97c5-4d8173541347",
00:26:32.283 "is_configured": true,
00:26:32.283 "data_offset": 2048,
00:26:32.283 "data_size": 63488
00:26:32.283 },
00:26:32.283 {
00:26:32.283 "name": "BaseBdev4",
00:26:32.283 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:32.283 "is_configured": false,
00:26:32.283 "data_offset": 0,
00:26:32.283 "data_size": 0
00:26:32.283 }
00:26:32.283 ]
00:26:32.283 }'
00:26:32.283 17:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:32.283 17:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:32.850 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:26:33.108 [2024-07-23 17:20:28.406251] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:26:33.108 [2024-07-23 17:20:28.406424] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x93e990
00:26:33.108 [2024-07-23 17:20:28.406439] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:26:33.108 [2024-07-23 17:20:28.406613] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaea0d0
00:26:33.108 [2024-07-23 17:20:28.406749] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x93e990
00:26:33.108 [2024-07-23 17:20:28.406760] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x93e990
00:26:33.108 [2024-07-23 17:20:28.406855] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:33.108 BaseBdev4
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:26:33.108 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:26:33.367 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:26:33.625 [
00:26:33.625 {
00:26:33.625 "name": "BaseBdev4",
00:26:33.625 "aliases": [
00:26:33.625 "33827cf7-ce73-4fe9-ab17-199dd89a012b"
00:26:33.625 ],
00:26:33.625 "product_name": "Malloc disk",
00:26:33.625 "block_size": 512,
00:26:33.625 "num_blocks": 65536,
00:26:33.625 "uuid": "33827cf7-ce73-4fe9-ab17-199dd89a012b",
00:26:33.626 "assigned_rate_limits": {
00:26:33.626 "rw_ios_per_sec": 0,
00:26:33.626 "rw_mbytes_per_sec": 0,
00:26:33.626 "r_mbytes_per_sec": 0,
00:26:33.626 "w_mbytes_per_sec": 0
00:26:33.626 },
00:26:33.626 "claimed": true,
00:26:33.626 "claim_type": "exclusive_write",
00:26:33.626 "zoned": false,
00:26:33.626 "supported_io_types": {
00:26:33.626 "read": true,
00:26:33.626 "write": true,
00:26:33.626 "unmap": true,
00:26:33.626 "flush": true,
00:26:33.626 "reset": true,
00:26:33.626 "nvme_admin": false,
00:26:33.626 "nvme_io": false,
00:26:33.626 "nvme_io_md": false,
00:26:33.626 "write_zeroes": true,
00:26:33.626 "zcopy": true,
00:26:33.626 "get_zone_info": false,
00:26:33.626 "zone_management": false,
00:26:33.626 "zone_append": false,
00:26:33.626 "compare": false,
00:26:33.626 "compare_and_write": false,
00:26:33.626 "abort": true,
00:26:33.626 "seek_hole": false,
00:26:33.626 "seek_data": false,
00:26:33.626 "copy": true,
00:26:33.626 "nvme_iov_md": false
00:26:33.626 },
00:26:33.626 "memory_domains": [
00:26:33.626 {
00:26:33.626 "dma_device_id": "system",
00:26:33.626 "dma_device_type": 1
00:26:33.626 },
00:26:33.626 {
00:26:33.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:26:33.626 "dma_device_type": 2
00:26:33.626 }
00:26:33.626 ],
00:26:33.626 "driver_specific": {}
00:26:33.626 }
00:26:33.626 ]
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:33.626 17:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:26:33.885 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:33.885 "name": "Existed_Raid",
00:26:33.885 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419",
00:26:33.885 "strip_size_kb": 0,
00:26:33.885 "state": "online",
00:26:33.885 "raid_level": "raid1",
00:26:33.885 "superblock": true,
00:26:33.885 "num_base_bdevs": 4,
00:26:33.885 "num_base_bdevs_discovered": 4,
00:26:33.885 "num_base_bdevs_operational": 4,
00:26:33.885 "base_bdevs_list": [
00:26:33.885 {
00:26:33.885 "name": "BaseBdev1",
00:26:33.885 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f",
00:26:33.885 "is_configured": true,
00:26:33.885 "data_offset": 2048,
00:26:33.885 "data_size": 63488
00:26:33.885 },
00:26:33.885 {
00:26:33.885 "name": "BaseBdev2",
00:26:33.886 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9",
00:26:33.886 "is_configured": true,
00:26:33.886 "data_offset": 2048,
00:26:33.886 "data_size": 63488
00:26:33.886 },
00:26:33.886 {
00:26:33.886 "name": "BaseBdev3",
00:26:33.886 "uuid": "3b3b6fee-f02d-4cd2-97c5-4d8173541347",
00:26:33.886 "is_configured": true,
00:26:33.886 "data_offset": 2048,
00:26:33.886 "data_size": 63488
00:26:33.886 },
00:26:33.886 {
00:26:33.886 "name": "BaseBdev4",
00:26:33.886 "uuid": "33827cf7-ce73-4fe9-ab17-199dd89a012b",
00:26:33.886 "is_configured": true,
00:26:33.886 "data_offset": 2048,
00:26:33.886 "data_size": 63488
00:26:33.886 }
00:26:33.886 ]
00:26:33.886 }'
00:26:33.886 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:33.886 17:20:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:34.453 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:26:34.453 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:26:34.453 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:26:34.453 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:26:34.453 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:26:34.454 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name
00:26:34.454 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:26:34.454 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:26:34.745 [2024-07-23 17:20:29.974759] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:26:34.745 17:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:26:34.745 "name": "Existed_Raid",
00:26:34.745 "aliases": [
00:26:34.745 "15e3d4e9-cb09-41b1-824f-97d38d1b6419"
00:26:34.745 ],
00:26:34.745 "product_name": "Raid Volume",
00:26:34.745 "block_size": 512,
00:26:34.745 "num_blocks": 63488,
00:26:34.745 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419",
00:26:34.745 "assigned_rate_limits": {
00:26:34.745 "rw_ios_per_sec": 0,
00:26:34.745 "rw_mbytes_per_sec": 0,
00:26:34.745 "r_mbytes_per_sec": 0,
00:26:34.745 "w_mbytes_per_sec": 0
00:26:34.745 },
00:26:34.745 "claimed": false,
00:26:34.745 "zoned": false,
00:26:34.745 "supported_io_types": {
00:26:34.745 "read": true,
00:26:34.745 "write": true,
00:26:34.745 "unmap": false,
00:26:34.745 "flush": false,
00:26:34.745 "reset": true,
00:26:34.745 "nvme_admin": false,
00:26:34.745 "nvme_io": false,
00:26:34.745 "nvme_io_md": false,
00:26:34.745 "write_zeroes": true,
00:26:34.745 "zcopy": false,
00:26:34.745 "get_zone_info": false,
00:26:34.745 "zone_management": false,
00:26:34.745 "zone_append": false,
00:26:34.745 "compare": false,
00:26:34.745 "compare_and_write": false,
00:26:34.745 "abort": false,
00:26:34.745 "seek_hole": false,
00:26:34.745 "seek_data": false,
00:26:34.745 "copy": false,
00:26:34.745 "nvme_iov_md": false
00:26:34.745 },
"memory_domains": [ 00:26:34.745 { 00:26:34.745 "dma_device_id": "system", 00:26:34.745 "dma_device_type": 1 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.745 "dma_device_type": 2 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "system", 00:26:34.745 "dma_device_type": 1 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.745 "dma_device_type": 2 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "system", 00:26:34.745 "dma_device_type": 1 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.745 "dma_device_type": 2 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "system", 00:26:34.745 "dma_device_type": 1 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.745 "dma_device_type": 2 00:26:34.745 } 00:26:34.745 ], 00:26:34.745 "driver_specific": { 00:26:34.745 "raid": { 00:26:34.745 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419", 00:26:34.745 "strip_size_kb": 0, 00:26:34.745 "state": "online", 00:26:34.745 "raid_level": "raid1", 00:26:34.745 "superblock": true, 00:26:34.745 "num_base_bdevs": 4, 00:26:34.745 "num_base_bdevs_discovered": 4, 00:26:34.745 "num_base_bdevs_operational": 4, 00:26:34.745 "base_bdevs_list": [ 00:26:34.745 { 00:26:34.745 "name": "BaseBdev1", 00:26:34.745 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f", 00:26:34.745 "is_configured": true, 00:26:34.745 "data_offset": 2048, 00:26:34.745 "data_size": 63488 00:26:34.745 }, 00:26:34.745 { 00:26:34.745 "name": "BaseBdev2", 00:26:34.746 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9", 00:26:34.746 "is_configured": true, 00:26:34.746 "data_offset": 2048, 00:26:34.746 "data_size": 63488 00:26:34.746 }, 00:26:34.746 { 00:26:34.746 "name": "BaseBdev3", 00:26:34.746 "uuid": "3b3b6fee-f02d-4cd2-97c5-4d8173541347", 00:26:34.746 "is_configured": true, 00:26:34.746 "data_offset": 2048, 00:26:34.746 
"data_size": 63488 00:26:34.746 }, 00:26:34.746 { 00:26:34.746 "name": "BaseBdev4", 00:26:34.746 "uuid": "33827cf7-ce73-4fe9-ab17-199dd89a012b", 00:26:34.746 "is_configured": true, 00:26:34.746 "data_offset": 2048, 00:26:34.746 "data_size": 63488 00:26:34.746 } 00:26:34.746 ] 00:26:34.746 } 00:26:34.746 } 00:26:34.746 }' 00:26:34.746 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:34.746 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:34.746 BaseBdev2 00:26:34.746 BaseBdev3 00:26:34.746 BaseBdev4' 00:26:34.746 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:34.746 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:34.746 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:35.038 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:35.038 "name": "BaseBdev1", 00:26:35.038 "aliases": [ 00:26:35.038 "31e87743-ca57-4025-ab8b-00e95881fe8f" 00:26:35.038 ], 00:26:35.038 "product_name": "Malloc disk", 00:26:35.038 "block_size": 512, 00:26:35.038 "num_blocks": 65536, 00:26:35.038 "uuid": "31e87743-ca57-4025-ab8b-00e95881fe8f", 00:26:35.038 "assigned_rate_limits": { 00:26:35.038 "rw_ios_per_sec": 0, 00:26:35.038 "rw_mbytes_per_sec": 0, 00:26:35.038 "r_mbytes_per_sec": 0, 00:26:35.038 "w_mbytes_per_sec": 0 00:26:35.038 }, 00:26:35.038 "claimed": true, 00:26:35.038 "claim_type": "exclusive_write", 00:26:35.038 "zoned": false, 00:26:35.038 "supported_io_types": { 00:26:35.038 "read": true, 00:26:35.038 "write": true, 00:26:35.038 "unmap": true, 00:26:35.038 "flush": true, 00:26:35.038 "reset": true, 
00:26:35.038 "nvme_admin": false, 00:26:35.038 "nvme_io": false, 00:26:35.038 "nvme_io_md": false, 00:26:35.038 "write_zeroes": true, 00:26:35.038 "zcopy": true, 00:26:35.038 "get_zone_info": false, 00:26:35.038 "zone_management": false, 00:26:35.038 "zone_append": false, 00:26:35.038 "compare": false, 00:26:35.038 "compare_and_write": false, 00:26:35.038 "abort": true, 00:26:35.038 "seek_hole": false, 00:26:35.038 "seek_data": false, 00:26:35.038 "copy": true, 00:26:35.038 "nvme_iov_md": false 00:26:35.038 }, 00:26:35.038 "memory_domains": [ 00:26:35.038 { 00:26:35.038 "dma_device_id": "system", 00:26:35.038 "dma_device_type": 1 00:26:35.038 }, 00:26:35.038 { 00:26:35.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.038 "dma_device_type": 2 00:26:35.038 } 00:26:35.038 ], 00:26:35.038 "driver_specific": {} 00:26:35.038 }' 00:26:35.038 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.038 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.038 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:35.038 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:35.038 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:35.296 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:35.555 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:35.555 "name": "BaseBdev2", 00:26:35.555 "aliases": [ 00:26:35.555 "71c27226-8526-47e6-8c9b-6285ae2e9db9" 00:26:35.555 ], 00:26:35.555 "product_name": "Malloc disk", 00:26:35.555 "block_size": 512, 00:26:35.555 "num_blocks": 65536, 00:26:35.555 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9", 00:26:35.555 "assigned_rate_limits": { 00:26:35.555 "rw_ios_per_sec": 0, 00:26:35.555 "rw_mbytes_per_sec": 0, 00:26:35.555 "r_mbytes_per_sec": 0, 00:26:35.555 "w_mbytes_per_sec": 0 00:26:35.555 }, 00:26:35.555 "claimed": true, 00:26:35.555 "claim_type": "exclusive_write", 00:26:35.555 "zoned": false, 00:26:35.555 "supported_io_types": { 00:26:35.555 "read": true, 00:26:35.555 "write": true, 00:26:35.555 "unmap": true, 00:26:35.555 "flush": true, 00:26:35.555 "reset": true, 00:26:35.555 "nvme_admin": false, 00:26:35.555 "nvme_io": false, 00:26:35.555 "nvme_io_md": false, 00:26:35.555 "write_zeroes": true, 00:26:35.555 "zcopy": true, 00:26:35.555 "get_zone_info": false, 00:26:35.555 "zone_management": false, 00:26:35.555 "zone_append": false, 00:26:35.555 "compare": false, 00:26:35.555 "compare_and_write": false, 00:26:35.555 "abort": true, 00:26:35.555 "seek_hole": false, 00:26:35.555 "seek_data": false, 00:26:35.555 "copy": true, 00:26:35.555 "nvme_iov_md": false 00:26:35.555 }, 00:26:35.555 "memory_domains": [ 00:26:35.555 { 
00:26:35.555 "dma_device_id": "system", 00:26:35.555 "dma_device_type": 1 00:26:35.555 }, 00:26:35.555 { 00:26:35.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.555 "dma_device_type": 2 00:26:35.555 } 00:26:35.555 ], 00:26:35.555 "driver_specific": {} 00:26:35.555 }' 00:26:35.555 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.555 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.813 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:35.813 17:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:35.813 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.071 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:36.071 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:36.071 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:36.071 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:36.329 17:20:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:36.329 "name": "BaseBdev3", 00:26:36.329 "aliases": [ 00:26:36.329 "3b3b6fee-f02d-4cd2-97c5-4d8173541347" 00:26:36.329 ], 00:26:36.329 "product_name": "Malloc disk", 00:26:36.329 "block_size": 512, 00:26:36.329 "num_blocks": 65536, 00:26:36.329 "uuid": "3b3b6fee-f02d-4cd2-97c5-4d8173541347", 00:26:36.329 "assigned_rate_limits": { 00:26:36.330 "rw_ios_per_sec": 0, 00:26:36.330 "rw_mbytes_per_sec": 0, 00:26:36.330 "r_mbytes_per_sec": 0, 00:26:36.330 "w_mbytes_per_sec": 0 00:26:36.330 }, 00:26:36.330 "claimed": true, 00:26:36.330 "claim_type": "exclusive_write", 00:26:36.330 "zoned": false, 00:26:36.330 "supported_io_types": { 00:26:36.330 "read": true, 00:26:36.330 "write": true, 00:26:36.330 "unmap": true, 00:26:36.330 "flush": true, 00:26:36.330 "reset": true, 00:26:36.330 "nvme_admin": false, 00:26:36.330 "nvme_io": false, 00:26:36.330 "nvme_io_md": false, 00:26:36.330 "write_zeroes": true, 00:26:36.330 "zcopy": true, 00:26:36.330 "get_zone_info": false, 00:26:36.330 "zone_management": false, 00:26:36.330 "zone_append": false, 00:26:36.330 "compare": false, 00:26:36.330 "compare_and_write": false, 00:26:36.330 "abort": true, 00:26:36.330 "seek_hole": false, 00:26:36.330 "seek_data": false, 00:26:36.330 "copy": true, 00:26:36.330 "nvme_iov_md": false 00:26:36.330 }, 00:26:36.330 "memory_domains": [ 00:26:36.330 { 00:26:36.330 "dma_device_id": "system", 00:26:36.330 "dma_device_type": 1 00:26:36.330 }, 00:26:36.330 { 00:26:36.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:36.330 "dma_device_type": 2 00:26:36.330 } 00:26:36.330 ], 00:26:36.330 "driver_specific": {} 00:26:36.330 }' 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:36.330 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:26:36.588 17:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:36.847 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:36.847 "name": "BaseBdev4", 00:26:36.847 "aliases": [ 00:26:36.847 "33827cf7-ce73-4fe9-ab17-199dd89a012b" 00:26:36.847 ], 00:26:36.847 "product_name": "Malloc disk", 00:26:36.847 "block_size": 512, 00:26:36.847 "num_blocks": 65536, 00:26:36.847 "uuid": "33827cf7-ce73-4fe9-ab17-199dd89a012b", 00:26:36.847 "assigned_rate_limits": { 00:26:36.847 "rw_ios_per_sec": 0, 00:26:36.847 "rw_mbytes_per_sec": 0, 00:26:36.847 "r_mbytes_per_sec": 0, 00:26:36.847 "w_mbytes_per_sec": 0 
00:26:36.847 }, 00:26:36.847 "claimed": true, 00:26:36.847 "claim_type": "exclusive_write", 00:26:36.847 "zoned": false, 00:26:36.847 "supported_io_types": { 00:26:36.847 "read": true, 00:26:36.847 "write": true, 00:26:36.847 "unmap": true, 00:26:36.847 "flush": true, 00:26:36.847 "reset": true, 00:26:36.847 "nvme_admin": false, 00:26:36.847 "nvme_io": false, 00:26:36.847 "nvme_io_md": false, 00:26:36.847 "write_zeroes": true, 00:26:36.847 "zcopy": true, 00:26:36.847 "get_zone_info": false, 00:26:36.847 "zone_management": false, 00:26:36.847 "zone_append": false, 00:26:36.847 "compare": false, 00:26:36.847 "compare_and_write": false, 00:26:36.847 "abort": true, 00:26:36.847 "seek_hole": false, 00:26:36.847 "seek_data": false, 00:26:36.847 "copy": true, 00:26:36.847 "nvme_iov_md": false 00:26:36.847 }, 00:26:36.847 "memory_domains": [ 00:26:36.847 { 00:26:36.847 "dma_device_id": "system", 00:26:36.847 "dma_device_type": 1 00:26:36.847 }, 00:26:36.847 { 00:26:36.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:36.847 "dma_device_type": 2 00:26:36.847 } 00:26:36.847 ], 00:26:36.847 "driver_specific": {} 00:26:36.847 }' 00:26:36.847 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.847 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.847 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:36.847 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.847 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:37.106 
17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:37.106 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:37.673 [2024-07-23 17:20:32.954361] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:37.673 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:37.673 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:37.673 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.674 17:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:37.933 17:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.933 "name": "Existed_Raid", 00:26:37.933 "uuid": "15e3d4e9-cb09-41b1-824f-97d38d1b6419", 00:26:37.933 "strip_size_kb": 0, 00:26:37.933 "state": "online", 00:26:37.933 "raid_level": "raid1", 00:26:37.933 "superblock": true, 00:26:37.933 "num_base_bdevs": 4, 00:26:37.933 "num_base_bdevs_discovered": 3, 00:26:37.933 "num_base_bdevs_operational": 3, 00:26:37.933 "base_bdevs_list": [ 00:26:37.933 { 00:26:37.933 "name": null, 00:26:37.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.933 "is_configured": false, 00:26:37.933 "data_offset": 2048, 00:26:37.933 "data_size": 63488 00:26:37.933 }, 00:26:37.933 { 00:26:37.933 "name": "BaseBdev2", 00:26:37.933 "uuid": "71c27226-8526-47e6-8c9b-6285ae2e9db9", 00:26:37.933 "is_configured": true, 00:26:37.933 "data_offset": 2048, 00:26:37.933 "data_size": 63488 00:26:37.933 }, 00:26:37.933 { 00:26:37.933 "name": "BaseBdev3", 00:26:37.933 "uuid": "3b3b6fee-f02d-4cd2-97c5-4d8173541347", 00:26:37.933 "is_configured": true, 00:26:37.933 "data_offset": 2048, 00:26:37.933 "data_size": 63488 00:26:37.933 }, 00:26:37.933 { 00:26:37.933 "name": 
"BaseBdev4", 00:26:37.933 "uuid": "33827cf7-ce73-4fe9-ab17-199dd89a012b", 00:26:37.933 "is_configured": true, 00:26:37.933 "data_offset": 2048, 00:26:37.933 "data_size": 63488 00:26:37.933 } 00:26:37.933 ] 00:26:37.933 }' 00:26:37.933 17:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.933 17:20:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:38.501 17:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:38.501 17:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:38.501 17:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.501 17:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:38.759 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:38.759 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:38.759 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:39.017 [2024-07-23 17:20:34.328049] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:39.017 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:39.017 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:39.017 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.017 17:20:34 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:39.276 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:39.276 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:39.276 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:26:39.535 [2024-07-23 17:20:34.832010] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:26:39.535 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:39.535 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:39.535 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.535 17:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:39.794 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:39.794 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:39.794 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:26:40.052 [2024-07-23 17:20:35.336100] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:26:40.052 [2024-07-23 17:20:35.336191] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:40.052 [2024-07-23 17:20:35.348965] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:40.052 [2024-07-23 17:20:35.349003] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:40.052 [2024-07-23 17:20:35.349014] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x93e990 name Existed_Raid, state offline 00:26:40.052 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:40.052 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:40.052 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.052 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:40.311 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:40.311 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:40.311 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:26:40.311 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:26:40.311 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:40.311 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:26:40.570 BaseBdev2 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:40.570 17:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:40.828 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:41.087 [ 00:26:41.087 { 00:26:41.087 "name": "BaseBdev2", 00:26:41.087 "aliases": [ 00:26:41.087 "201d52e7-ee88-4d91-85e3-d9244fb04d89" 00:26:41.087 ], 00:26:41.087 "product_name": "Malloc disk", 00:26:41.087 "block_size": 512, 00:26:41.087 "num_blocks": 65536, 00:26:41.087 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:41.087 "assigned_rate_limits": { 00:26:41.087 "rw_ios_per_sec": 0, 00:26:41.087 "rw_mbytes_per_sec": 0, 00:26:41.087 "r_mbytes_per_sec": 0, 00:26:41.087 "w_mbytes_per_sec": 0 00:26:41.087 }, 00:26:41.087 "claimed": false, 00:26:41.087 "zoned": false, 00:26:41.087 "supported_io_types": { 00:26:41.087 "read": true, 00:26:41.087 "write": true, 00:26:41.087 "unmap": true, 00:26:41.087 "flush": true, 00:26:41.087 "reset": true, 00:26:41.087 "nvme_admin": false, 00:26:41.087 "nvme_io": false, 00:26:41.087 "nvme_io_md": false, 00:26:41.087 "write_zeroes": true, 00:26:41.087 "zcopy": true, 00:26:41.087 "get_zone_info": false, 00:26:41.087 "zone_management": false, 00:26:41.087 "zone_append": false, 00:26:41.087 "compare": false, 00:26:41.087 "compare_and_write": false, 00:26:41.087 "abort": true, 00:26:41.087 "seek_hole": false, 00:26:41.087 "seek_data": false, 00:26:41.087 "copy": true, 00:26:41.087 "nvme_iov_md": false 00:26:41.087 }, 00:26:41.087 
"memory_domains": [ 00:26:41.087 { 00:26:41.088 "dma_device_id": "system", 00:26:41.088 "dma_device_type": 1 00:26:41.088 }, 00:26:41.088 { 00:26:41.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:41.088 "dma_device_type": 2 00:26:41.088 } 00:26:41.088 ], 00:26:41.088 "driver_specific": {} 00:26:41.088 } 00:26:41.088 ] 00:26:41.088 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:26:41.088 17:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:41.088 17:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:41.088 17:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:26:41.346 BaseBdev3 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:41.346 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:41.605 17:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:26:41.864 [ 00:26:41.864 { 00:26:41.864 "name": "BaseBdev3", 00:26:41.864 "aliases": [ 00:26:41.864 "91638bff-acd2-40c3-ab14-5c221caa7b75" 00:26:41.864 ], 00:26:41.864 "product_name": "Malloc disk", 00:26:41.864 "block_size": 512, 00:26:41.864 "num_blocks": 65536, 00:26:41.864 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:41.864 "assigned_rate_limits": { 00:26:41.864 "rw_ios_per_sec": 0, 00:26:41.864 "rw_mbytes_per_sec": 0, 00:26:41.864 "r_mbytes_per_sec": 0, 00:26:41.864 "w_mbytes_per_sec": 0 00:26:41.864 }, 00:26:41.864 "claimed": false, 00:26:41.864 "zoned": false, 00:26:41.864 "supported_io_types": { 00:26:41.864 "read": true, 00:26:41.864 "write": true, 00:26:41.864 "unmap": true, 00:26:41.864 "flush": true, 00:26:41.864 "reset": true, 00:26:41.864 "nvme_admin": false, 00:26:41.864 "nvme_io": false, 00:26:41.864 "nvme_io_md": false, 00:26:41.864 "write_zeroes": true, 00:26:41.864 "zcopy": true, 00:26:41.864 "get_zone_info": false, 00:26:41.864 "zone_management": false, 00:26:41.864 "zone_append": false, 00:26:41.864 "compare": false, 00:26:41.864 "compare_and_write": false, 00:26:41.864 "abort": true, 00:26:41.864 "seek_hole": false, 00:26:41.864 "seek_data": false, 00:26:41.864 "copy": true, 00:26:41.864 "nvme_iov_md": false 00:26:41.864 }, 00:26:41.864 "memory_domains": [ 00:26:41.864 { 00:26:41.864 "dma_device_id": "system", 00:26:41.864 "dma_device_type": 1 00:26:41.864 }, 00:26:41.864 { 00:26:41.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:41.864 "dma_device_type": 2 00:26:41.864 } 00:26:41.864 ], 00:26:41.864 "driver_specific": {} 00:26:41.864 } 00:26:41.864 ] 00:26:41.864 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:26:41.864 17:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:41.864 17:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:41.864 17:20:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:26:42.123 BaseBdev4 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:42.123 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:42.382 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:26:42.382 [ 00:26:42.382 { 00:26:42.382 "name": "BaseBdev4", 00:26:42.382 "aliases": [ 00:26:42.382 "90d3404e-ca65-488f-827c-4000f9d15e65" 00:26:42.382 ], 00:26:42.382 "product_name": "Malloc disk", 00:26:42.382 "block_size": 512, 00:26:42.382 "num_blocks": 65536, 00:26:42.382 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:42.382 "assigned_rate_limits": { 00:26:42.382 "rw_ios_per_sec": 0, 00:26:42.382 "rw_mbytes_per_sec": 0, 00:26:42.382 "r_mbytes_per_sec": 0, 00:26:42.382 "w_mbytes_per_sec": 0 00:26:42.382 }, 00:26:42.382 "claimed": false, 00:26:42.382 "zoned": false, 00:26:42.382 "supported_io_types": { 00:26:42.382 "read": true, 
00:26:42.382 "write": true, 00:26:42.382 "unmap": true, 00:26:42.382 "flush": true, 00:26:42.382 "reset": true, 00:26:42.382 "nvme_admin": false, 00:26:42.382 "nvme_io": false, 00:26:42.382 "nvme_io_md": false, 00:26:42.382 "write_zeroes": true, 00:26:42.382 "zcopy": true, 00:26:42.382 "get_zone_info": false, 00:26:42.382 "zone_management": false, 00:26:42.382 "zone_append": false, 00:26:42.382 "compare": false, 00:26:42.382 "compare_and_write": false, 00:26:42.382 "abort": true, 00:26:42.382 "seek_hole": false, 00:26:42.382 "seek_data": false, 00:26:42.382 "copy": true, 00:26:42.382 "nvme_iov_md": false 00:26:42.382 }, 00:26:42.382 "memory_domains": [ 00:26:42.382 { 00:26:42.382 "dma_device_id": "system", 00:26:42.382 "dma_device_type": 1 00:26:42.382 }, 00:26:42.382 { 00:26:42.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:42.382 "dma_device_type": 2 00:26:42.382 } 00:26:42.382 ], 00:26:42.382 "driver_specific": {} 00:26:42.382 } 00:26:42.382 ] 00:26:42.641 17:20:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:26:42.641 17:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:26:42.641 17:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:26:42.641 17:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:26:42.641 [2024-07-23 17:20:38.030261] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:42.641 [2024-07-23 17:20:38.030306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:42.641 [2024-07-23 17:20:38.030326] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:42.641 [2024-07-23 17:20:38.031638] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:42.641 [2024-07-23 17:20:38.031678] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.641 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:42.900 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.900 "name": "Existed_Raid", 00:26:42.900 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:42.900 "strip_size_kb": 0, 00:26:42.900 "state": 
"configuring", 00:26:42.900 "raid_level": "raid1", 00:26:42.900 "superblock": true, 00:26:42.900 "num_base_bdevs": 4, 00:26:42.900 "num_base_bdevs_discovered": 3, 00:26:42.900 "num_base_bdevs_operational": 4, 00:26:42.900 "base_bdevs_list": [ 00:26:42.900 { 00:26:42.900 "name": "BaseBdev1", 00:26:42.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.900 "is_configured": false, 00:26:42.900 "data_offset": 0, 00:26:42.900 "data_size": 0 00:26:42.900 }, 00:26:42.900 { 00:26:42.900 "name": "BaseBdev2", 00:26:42.900 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:42.900 "is_configured": true, 00:26:42.900 "data_offset": 2048, 00:26:42.900 "data_size": 63488 00:26:42.900 }, 00:26:42.900 { 00:26:42.900 "name": "BaseBdev3", 00:26:42.900 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:42.900 "is_configured": true, 00:26:42.900 "data_offset": 2048, 00:26:42.900 "data_size": 63488 00:26:42.900 }, 00:26:42.900 { 00:26:42.900 "name": "BaseBdev4", 00:26:42.900 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:42.900 "is_configured": true, 00:26:42.900 "data_offset": 2048, 00:26:42.900 "data_size": 63488 00:26:42.900 } 00:26:42.900 ] 00:26:42.900 }' 00:26:42.900 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.900 17:20:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:43.468 17:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:43.727 [2024-07-23 17:20:39.097032] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:43.727 
17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:43.727 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.986 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:43.986 "name": "Existed_Raid", 00:26:43.986 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:43.986 "strip_size_kb": 0, 00:26:43.986 "state": "configuring", 00:26:43.986 "raid_level": "raid1", 00:26:43.986 "superblock": true, 00:26:43.986 "num_base_bdevs": 4, 00:26:43.986 "num_base_bdevs_discovered": 2, 00:26:43.986 "num_base_bdevs_operational": 4, 00:26:43.986 "base_bdevs_list": [ 00:26:43.986 { 00:26:43.986 "name": "BaseBdev1", 00:26:43.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:43.986 "is_configured": false, 00:26:43.986 "data_offset": 0, 00:26:43.986 "data_size": 0 00:26:43.986 }, 00:26:43.986 { 00:26:43.986 
"name": null, 00:26:43.986 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:43.986 "is_configured": false, 00:26:43.986 "data_offset": 2048, 00:26:43.986 "data_size": 63488 00:26:43.986 }, 00:26:43.986 { 00:26:43.986 "name": "BaseBdev3", 00:26:43.986 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:43.986 "is_configured": true, 00:26:43.986 "data_offset": 2048, 00:26:43.986 "data_size": 63488 00:26:43.986 }, 00:26:43.986 { 00:26:43.986 "name": "BaseBdev4", 00:26:43.986 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:43.986 "is_configured": true, 00:26:43.986 "data_offset": 2048, 00:26:43.986 "data_size": 63488 00:26:43.986 } 00:26:43.986 ] 00:26:43.986 }' 00:26:43.986 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:43.986 17:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:44.921 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.921 17:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:26:44.921 17:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:26:44.921 17:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:26:45.179 [2024-07-23 17:20:40.484148] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:45.179 BaseBdev1 00:26:45.179 17:20:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:26:45.179 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:45.179 17:20:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:45.179 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:26:45.179 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:45.179 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:45.179 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:45.438 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:45.696 [ 00:26:45.696 { 00:26:45.696 "name": "BaseBdev1", 00:26:45.696 "aliases": [ 00:26:45.696 "2424e0f5-8d1e-4483-a9b9-50634a672b8c" 00:26:45.696 ], 00:26:45.696 "product_name": "Malloc disk", 00:26:45.696 "block_size": 512, 00:26:45.696 "num_blocks": 65536, 00:26:45.697 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:45.697 "assigned_rate_limits": { 00:26:45.697 "rw_ios_per_sec": 0, 00:26:45.697 "rw_mbytes_per_sec": 0, 00:26:45.697 "r_mbytes_per_sec": 0, 00:26:45.697 "w_mbytes_per_sec": 0 00:26:45.697 }, 00:26:45.697 "claimed": true, 00:26:45.697 "claim_type": "exclusive_write", 00:26:45.697 "zoned": false, 00:26:45.697 "supported_io_types": { 00:26:45.697 "read": true, 00:26:45.697 "write": true, 00:26:45.697 "unmap": true, 00:26:45.697 "flush": true, 00:26:45.697 "reset": true, 00:26:45.697 "nvme_admin": false, 00:26:45.697 "nvme_io": false, 00:26:45.697 "nvme_io_md": false, 00:26:45.697 "write_zeroes": true, 00:26:45.697 "zcopy": true, 00:26:45.697 "get_zone_info": false, 00:26:45.697 "zone_management": false, 00:26:45.697 "zone_append": false, 00:26:45.697 "compare": false, 00:26:45.697 
"compare_and_write": false, 00:26:45.697 "abort": true, 00:26:45.697 "seek_hole": false, 00:26:45.697 "seek_data": false, 00:26:45.697 "copy": true, 00:26:45.697 "nvme_iov_md": false 00:26:45.697 }, 00:26:45.697 "memory_domains": [ 00:26:45.697 { 00:26:45.697 "dma_device_id": "system", 00:26:45.697 "dma_device_type": 1 00:26:45.697 }, 00:26:45.697 { 00:26:45.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:45.697 "dma_device_type": 2 00:26:45.697 } 00:26:45.697 ], 00:26:45.697 "driver_specific": {} 00:26:45.697 } 00:26:45.697 ] 00:26:45.697 17:20:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.697 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:45.697 17:20:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.956 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.956 "name": "Existed_Raid", 00:26:45.956 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:45.956 "strip_size_kb": 0, 00:26:45.956 "state": "configuring", 00:26:45.956 "raid_level": "raid1", 00:26:45.956 "superblock": true, 00:26:45.956 "num_base_bdevs": 4, 00:26:45.956 "num_base_bdevs_discovered": 3, 00:26:45.956 "num_base_bdevs_operational": 4, 00:26:45.956 "base_bdevs_list": [ 00:26:45.956 { 00:26:45.956 "name": "BaseBdev1", 00:26:45.956 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:45.956 "is_configured": true, 00:26:45.956 "data_offset": 2048, 00:26:45.956 "data_size": 63488 00:26:45.956 }, 00:26:45.956 { 00:26:45.956 "name": null, 00:26:45.956 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:45.956 "is_configured": false, 00:26:45.956 "data_offset": 2048, 00:26:45.956 "data_size": 63488 00:26:45.956 }, 00:26:45.956 { 00:26:45.956 "name": "BaseBdev3", 00:26:45.956 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:45.956 "is_configured": true, 00:26:45.956 "data_offset": 2048, 00:26:45.956 "data_size": 63488 00:26:45.956 }, 00:26:45.956 { 00:26:45.956 "name": "BaseBdev4", 00:26:45.956 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:45.956 "is_configured": true, 00:26:45.956 "data_offset": 2048, 00:26:45.956 "data_size": 63488 00:26:45.956 } 00:26:45.956 ] 00:26:45.956 }' 00:26:45.956 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.956 17:20:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:46.523 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.523 17:20:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:26:46.781 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:26:46.781 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:26:47.040 [2024-07-23 17:20:42.325063] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:47.040 17:20:42 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.299 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:47.299 "name": "Existed_Raid", 00:26:47.299 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:47.299 "strip_size_kb": 0, 00:26:47.299 "state": "configuring", 00:26:47.299 "raid_level": "raid1", 00:26:47.299 "superblock": true, 00:26:47.299 "num_base_bdevs": 4, 00:26:47.299 "num_base_bdevs_discovered": 2, 00:26:47.299 "num_base_bdevs_operational": 4, 00:26:47.299 "base_bdevs_list": [ 00:26:47.299 { 00:26:47.299 "name": "BaseBdev1", 00:26:47.299 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:47.299 "is_configured": true, 00:26:47.299 "data_offset": 2048, 00:26:47.299 "data_size": 63488 00:26:47.299 }, 00:26:47.299 { 00:26:47.299 "name": null, 00:26:47.299 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:47.299 "is_configured": false, 00:26:47.299 "data_offset": 2048, 00:26:47.299 "data_size": 63488 00:26:47.299 }, 00:26:47.299 { 00:26:47.299 "name": null, 00:26:47.299 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:47.299 "is_configured": false, 00:26:47.299 "data_offset": 2048, 00:26:47.299 "data_size": 63488 00:26:47.299 }, 00:26:47.299 { 00:26:47.299 "name": "BaseBdev4", 00:26:47.299 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:47.299 "is_configured": true, 00:26:47.299 "data_offset": 2048, 00:26:47.299 "data_size": 63488 00:26:47.299 } 00:26:47.299 ] 00:26:47.299 }' 00:26:47.299 17:20:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:47.299 17:20:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:47.866 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:26:47.866 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:26:48.125 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:26:48.125 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:26:48.383 [2024-07-23 17:20:43.668644] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:26:48.383 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:48.643 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.643 "name": "Existed_Raid", 00:26:48.643 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:48.643 "strip_size_kb": 0, 00:26:48.643 "state": "configuring", 00:26:48.643 "raid_level": "raid1", 00:26:48.643 "superblock": true, 00:26:48.643 "num_base_bdevs": 4, 00:26:48.643 "num_base_bdevs_discovered": 3, 00:26:48.643 "num_base_bdevs_operational": 4, 00:26:48.643 "base_bdevs_list": [ 00:26:48.643 { 00:26:48.643 "name": "BaseBdev1", 00:26:48.643 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:48.643 "is_configured": true, 00:26:48.643 "data_offset": 2048, 00:26:48.643 "data_size": 63488 00:26:48.643 }, 00:26:48.643 { 00:26:48.643 "name": null, 00:26:48.643 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:48.643 "is_configured": false, 00:26:48.643 "data_offset": 2048, 00:26:48.643 "data_size": 63488 00:26:48.643 }, 00:26:48.643 { 00:26:48.643 "name": "BaseBdev3", 00:26:48.643 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:48.643 "is_configured": true, 00:26:48.643 "data_offset": 2048, 00:26:48.643 "data_size": 63488 00:26:48.643 }, 00:26:48.643 { 00:26:48.643 "name": "BaseBdev4", 00:26:48.643 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:48.643 "is_configured": true, 00:26:48.643 "data_offset": 2048, 00:26:48.643 "data_size": 63488 00:26:48.643 } 00:26:48.643 ] 00:26:48.643 }' 00:26:48.643 17:20:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.643 17:20:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:49.252 17:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:26:49.252 17:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:26:49.511 17:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:26:49.511 17:20:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:49.771 [2024-07-23 17:20:44.996189] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:49.771 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.771 17:20:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:50.338 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.338 "name": "Existed_Raid", 00:26:50.338 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:50.338 "strip_size_kb": 0, 00:26:50.338 "state": "configuring", 00:26:50.338 "raid_level": "raid1", 00:26:50.338 "superblock": true, 00:26:50.338 "num_base_bdevs": 4, 00:26:50.338 "num_base_bdevs_discovered": 2, 00:26:50.338 "num_base_bdevs_operational": 4, 00:26:50.338 "base_bdevs_list": [ 00:26:50.338 { 00:26:50.338 "name": null, 00:26:50.338 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:50.338 "is_configured": false, 00:26:50.338 "data_offset": 2048, 00:26:50.338 "data_size": 63488 00:26:50.338 }, 00:26:50.338 { 00:26:50.338 "name": null, 00:26:50.338 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:50.338 "is_configured": false, 00:26:50.338 "data_offset": 2048, 00:26:50.338 "data_size": 63488 00:26:50.338 }, 00:26:50.338 { 00:26:50.338 "name": "BaseBdev3", 00:26:50.338 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:50.338 "is_configured": true, 00:26:50.338 "data_offset": 2048, 00:26:50.338 "data_size": 63488 00:26:50.338 }, 00:26:50.338 { 00:26:50.338 "name": "BaseBdev4", 00:26:50.338 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:50.338 "is_configured": true, 00:26:50.338 "data_offset": 2048, 00:26:50.338 "data_size": 63488 00:26:50.338 } 00:26:50.338 ] 00:26:50.338 }' 00:26:50.338 17:20:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.338 17:20:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:51.275 17:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.275 17:20:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:26:51.534 17:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:26:51.534 17:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:26:51.794 [2024-07-23 17:20:47.001909] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:51.794 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.794 17:20:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:52.053 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:52.053 "name": "Existed_Raid", 00:26:52.053 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:52.053 "strip_size_kb": 0, 00:26:52.053 "state": "configuring", 00:26:52.053 "raid_level": "raid1", 00:26:52.053 "superblock": true, 00:26:52.053 "num_base_bdevs": 4, 00:26:52.053 "num_base_bdevs_discovered": 3, 00:26:52.053 "num_base_bdevs_operational": 4, 00:26:52.053 "base_bdevs_list": [ 00:26:52.053 { 00:26:52.053 "name": null, 00:26:52.053 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:52.053 "is_configured": false, 00:26:52.053 "data_offset": 2048, 00:26:52.053 "data_size": 63488 00:26:52.053 }, 00:26:52.053 { 00:26:52.053 "name": "BaseBdev2", 00:26:52.053 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:52.053 "is_configured": true, 00:26:52.053 "data_offset": 2048, 00:26:52.053 "data_size": 63488 00:26:52.053 }, 00:26:52.053 { 00:26:52.053 "name": "BaseBdev3", 00:26:52.053 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:52.053 "is_configured": true, 00:26:52.053 "data_offset": 2048, 00:26:52.053 "data_size": 63488 00:26:52.053 }, 00:26:52.053 { 00:26:52.053 "name": "BaseBdev4", 00:26:52.053 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:52.053 "is_configured": true, 00:26:52.053 "data_offset": 2048, 00:26:52.053 "data_size": 63488 00:26:52.053 } 00:26:52.053 ] 00:26:52.053 }' 00:26:52.053 17:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:52.053 17:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:52.990 17:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.990 17:20:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:26:52.990 17:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:26:52.990 17:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.990 17:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:26:53.558 17:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2424e0f5-8d1e-4483-a9b9-50634a672b8c 00:26:53.818 [2024-07-23 17:20:49.116078] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:26:53.818 [2024-07-23 17:20:49.116250] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xaf2c50 00:26:53.818 [2024-07-23 17:20:49.116264] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:53.818 [2024-07-23 17:20:49.116447] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaf38e0 00:26:53.818 [2024-07-23 17:20:49.116571] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaf2c50 00:26:53.818 [2024-07-23 17:20:49.116581] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xaf2c50 00:26:53.818 [2024-07-23 17:20:49.116672] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:53.818 NewBaseBdev 00:26:53.818 17:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:26:53.818 17:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:26:53.818 17:20:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:53.818 17:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:26:53.818 17:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:53.818 17:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:53.818 17:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:54.385 17:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:26:54.953 [ 00:26:54.953 { 00:26:54.953 "name": "NewBaseBdev", 00:26:54.953 "aliases": [ 00:26:54.953 "2424e0f5-8d1e-4483-a9b9-50634a672b8c" 00:26:54.953 ], 00:26:54.953 "product_name": "Malloc disk", 00:26:54.953 "block_size": 512, 00:26:54.953 "num_blocks": 65536, 00:26:54.953 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:54.953 "assigned_rate_limits": { 00:26:54.953 "rw_ios_per_sec": 0, 00:26:54.953 "rw_mbytes_per_sec": 0, 00:26:54.953 "r_mbytes_per_sec": 0, 00:26:54.953 "w_mbytes_per_sec": 0 00:26:54.953 }, 00:26:54.953 "claimed": true, 00:26:54.953 "claim_type": "exclusive_write", 00:26:54.953 "zoned": false, 00:26:54.953 "supported_io_types": { 00:26:54.953 "read": true, 00:26:54.953 "write": true, 00:26:54.953 "unmap": true, 00:26:54.953 "flush": true, 00:26:54.953 "reset": true, 00:26:54.953 "nvme_admin": false, 00:26:54.953 "nvme_io": false, 00:26:54.953 "nvme_io_md": false, 00:26:54.953 "write_zeroes": true, 00:26:54.953 "zcopy": true, 00:26:54.953 "get_zone_info": false, 00:26:54.953 "zone_management": false, 00:26:54.953 "zone_append": false, 00:26:54.953 "compare": false, 00:26:54.953 
"compare_and_write": false, 00:26:54.953 "abort": true, 00:26:54.953 "seek_hole": false, 00:26:54.953 "seek_data": false, 00:26:54.953 "copy": true, 00:26:54.953 "nvme_iov_md": false 00:26:54.953 }, 00:26:54.953 "memory_domains": [ 00:26:54.953 { 00:26:54.953 "dma_device_id": "system", 00:26:54.953 "dma_device_type": 1 00:26:54.953 }, 00:26:54.953 { 00:26:54.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.953 "dma_device_type": 2 00:26:54.953 } 00:26:54.953 ], 00:26:54.953 "driver_specific": {} 00:26:54.953 } 00:26:54.953 ] 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:54.953 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:55.211 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.211 "name": "Existed_Raid", 00:26:55.211 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:55.211 "strip_size_kb": 0, 00:26:55.211 "state": "online", 00:26:55.211 "raid_level": "raid1", 00:26:55.211 "superblock": true, 00:26:55.211 "num_base_bdevs": 4, 00:26:55.211 "num_base_bdevs_discovered": 4, 00:26:55.211 "num_base_bdevs_operational": 4, 00:26:55.211 "base_bdevs_list": [ 00:26:55.211 { 00:26:55.211 "name": "NewBaseBdev", 00:26:55.211 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:55.211 "is_configured": true, 00:26:55.211 "data_offset": 2048, 00:26:55.211 "data_size": 63488 00:26:55.211 }, 00:26:55.211 { 00:26:55.211 "name": "BaseBdev2", 00:26:55.211 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:55.212 "is_configured": true, 00:26:55.212 "data_offset": 2048, 00:26:55.212 "data_size": 63488 00:26:55.212 }, 00:26:55.212 { 00:26:55.212 "name": "BaseBdev3", 00:26:55.212 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:55.212 "is_configured": true, 00:26:55.212 "data_offset": 2048, 00:26:55.212 "data_size": 63488 00:26:55.212 }, 00:26:55.212 { 00:26:55.212 "name": "BaseBdev4", 00:26:55.212 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:55.212 "is_configured": true, 00:26:55.212 "data_offset": 2048, 00:26:55.212 "data_size": 63488 00:26:55.212 } 00:26:55.212 ] 00:26:55.212 }' 00:26:55.212 17:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.212 17:20:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:26:56.147 17:20:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:56.147 [2024-07-23 17:20:51.514764] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:56.147 "name": "Existed_Raid", 00:26:56.147 "aliases": [ 00:26:56.147 "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0" 00:26:56.147 ], 00:26:56.147 "product_name": "Raid Volume", 00:26:56.147 "block_size": 512, 00:26:56.147 "num_blocks": 63488, 00:26:56.147 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:56.147 "assigned_rate_limits": { 00:26:56.147 "rw_ios_per_sec": 0, 00:26:56.147 "rw_mbytes_per_sec": 0, 00:26:56.147 "r_mbytes_per_sec": 0, 00:26:56.147 "w_mbytes_per_sec": 0 00:26:56.147 }, 00:26:56.147 "claimed": false, 00:26:56.147 "zoned": false, 00:26:56.147 "supported_io_types": { 00:26:56.147 "read": true, 00:26:56.147 "write": true, 00:26:56.147 "unmap": false, 00:26:56.147 "flush": false, 00:26:56.147 "reset": true, 00:26:56.147 "nvme_admin": false, 00:26:56.147 "nvme_io": false, 00:26:56.147 "nvme_io_md": false, 00:26:56.147 "write_zeroes": true, 00:26:56.147 "zcopy": false, 00:26:56.147 
"get_zone_info": false, 00:26:56.147 "zone_management": false, 00:26:56.147 "zone_append": false, 00:26:56.147 "compare": false, 00:26:56.147 "compare_and_write": false, 00:26:56.147 "abort": false, 00:26:56.147 "seek_hole": false, 00:26:56.147 "seek_data": false, 00:26:56.147 "copy": false, 00:26:56.147 "nvme_iov_md": false 00:26:56.147 }, 00:26:56.147 "memory_domains": [ 00:26:56.147 { 00:26:56.147 "dma_device_id": "system", 00:26:56.147 "dma_device_type": 1 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:56.147 "dma_device_type": 2 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "system", 00:26:56.147 "dma_device_type": 1 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:56.147 "dma_device_type": 2 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "system", 00:26:56.147 "dma_device_type": 1 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:56.147 "dma_device_type": 2 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "system", 00:26:56.147 "dma_device_type": 1 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:56.147 "dma_device_type": 2 00:26:56.147 } 00:26:56.147 ], 00:26:56.147 "driver_specific": { 00:26:56.147 "raid": { 00:26:56.147 "uuid": "8affe2bc-7ab3-4eaa-bbc4-34f3d3da82c0", 00:26:56.147 "strip_size_kb": 0, 00:26:56.147 "state": "online", 00:26:56.147 "raid_level": "raid1", 00:26:56.147 "superblock": true, 00:26:56.147 "num_base_bdevs": 4, 00:26:56.147 "num_base_bdevs_discovered": 4, 00:26:56.147 "num_base_bdevs_operational": 4, 00:26:56.147 "base_bdevs_list": [ 00:26:56.147 { 00:26:56.147 "name": "NewBaseBdev", 00:26:56.147 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:56.147 "is_configured": true, 00:26:56.147 "data_offset": 2048, 00:26:56.147 "data_size": 63488 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "name": "BaseBdev2", 00:26:56.147 
"uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:56.147 "is_configured": true, 00:26:56.147 "data_offset": 2048, 00:26:56.147 "data_size": 63488 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "name": "BaseBdev3", 00:26:56.147 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:56.147 "is_configured": true, 00:26:56.147 "data_offset": 2048, 00:26:56.147 "data_size": 63488 00:26:56.147 }, 00:26:56.147 { 00:26:56.147 "name": "BaseBdev4", 00:26:56.147 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:56.147 "is_configured": true, 00:26:56.147 "data_offset": 2048, 00:26:56.147 "data_size": 63488 00:26:56.147 } 00:26:56.147 ] 00:26:56.147 } 00:26:56.147 } 00:26:56.147 }' 00:26:56.147 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:56.405 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:26:56.405 BaseBdev2 00:26:56.405 BaseBdev3 00:26:56.405 BaseBdev4' 00:26:56.405 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:56.405 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:26:56.405 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:56.664 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:56.664 "name": "NewBaseBdev", 00:26:56.664 "aliases": [ 00:26:56.664 "2424e0f5-8d1e-4483-a9b9-50634a672b8c" 00:26:56.664 ], 00:26:56.664 "product_name": "Malloc disk", 00:26:56.664 "block_size": 512, 00:26:56.664 "num_blocks": 65536, 00:26:56.664 "uuid": "2424e0f5-8d1e-4483-a9b9-50634a672b8c", 00:26:56.664 "assigned_rate_limits": { 00:26:56.664 "rw_ios_per_sec": 0, 00:26:56.664 "rw_mbytes_per_sec": 0, 
00:26:56.664 "r_mbytes_per_sec": 0, 00:26:56.664 "w_mbytes_per_sec": 0 00:26:56.664 }, 00:26:56.664 "claimed": true, 00:26:56.664 "claim_type": "exclusive_write", 00:26:56.664 "zoned": false, 00:26:56.664 "supported_io_types": { 00:26:56.664 "read": true, 00:26:56.664 "write": true, 00:26:56.664 "unmap": true, 00:26:56.664 "flush": true, 00:26:56.664 "reset": true, 00:26:56.664 "nvme_admin": false, 00:26:56.664 "nvme_io": false, 00:26:56.664 "nvme_io_md": false, 00:26:56.664 "write_zeroes": true, 00:26:56.664 "zcopy": true, 00:26:56.664 "get_zone_info": false, 00:26:56.664 "zone_management": false, 00:26:56.664 "zone_append": false, 00:26:56.664 "compare": false, 00:26:56.664 "compare_and_write": false, 00:26:56.664 "abort": true, 00:26:56.664 "seek_hole": false, 00:26:56.664 "seek_data": false, 00:26:56.664 "copy": true, 00:26:56.665 "nvme_iov_md": false 00:26:56.665 }, 00:26:56.665 "memory_domains": [ 00:26:56.665 { 00:26:56.665 "dma_device_id": "system", 00:26:56.665 "dma_device_type": 1 00:26:56.665 }, 00:26:56.665 { 00:26:56.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:56.665 "dma_device_type": 2 00:26:56.665 } 00:26:56.665 ], 00:26:56.665 "driver_specific": {} 00:26:56.665 }' 00:26:56.665 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:56.665 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:56.665 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:56.665 17:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:56.665 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:56.665 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:56.665 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:56.923 17:20:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:56.923 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:57.180 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:57.180 "name": "BaseBdev2", 00:26:57.181 "aliases": [ 00:26:57.181 "201d52e7-ee88-4d91-85e3-d9244fb04d89" 00:26:57.181 ], 00:26:57.181 "product_name": "Malloc disk", 00:26:57.181 "block_size": 512, 00:26:57.181 "num_blocks": 65536, 00:26:57.181 "uuid": "201d52e7-ee88-4d91-85e3-d9244fb04d89", 00:26:57.181 "assigned_rate_limits": { 00:26:57.181 "rw_ios_per_sec": 0, 00:26:57.181 "rw_mbytes_per_sec": 0, 00:26:57.181 "r_mbytes_per_sec": 0, 00:26:57.181 "w_mbytes_per_sec": 0 00:26:57.181 }, 00:26:57.181 "claimed": true, 00:26:57.181 "claim_type": "exclusive_write", 00:26:57.181 "zoned": false, 00:26:57.181 "supported_io_types": { 00:26:57.181 "read": true, 00:26:57.181 "write": true, 00:26:57.181 "unmap": true, 00:26:57.181 "flush": true, 00:26:57.181 "reset": true, 00:26:57.181 "nvme_admin": false, 00:26:57.181 "nvme_io": false, 00:26:57.181 "nvme_io_md": false, 00:26:57.181 "write_zeroes": true, 00:26:57.181 "zcopy": true, 00:26:57.181 
"get_zone_info": false, 00:26:57.181 "zone_management": false, 00:26:57.181 "zone_append": false, 00:26:57.181 "compare": false, 00:26:57.181 "compare_and_write": false, 00:26:57.181 "abort": true, 00:26:57.181 "seek_hole": false, 00:26:57.181 "seek_data": false, 00:26:57.181 "copy": true, 00:26:57.181 "nvme_iov_md": false 00:26:57.181 }, 00:26:57.181 "memory_domains": [ 00:26:57.181 { 00:26:57.181 "dma_device_id": "system", 00:26:57.181 "dma_device_type": 1 00:26:57.181 }, 00:26:57.181 { 00:26:57.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:57.181 "dma_device_type": 2 00:26:57.181 } 00:26:57.181 ], 00:26:57.181 "driver_specific": {} 00:26:57.181 }' 00:26:57.181 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:57.181 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:57.181 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:57.181 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:57.181 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:57.438 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:57.438 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:57.438 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:57.438 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:57.438 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:57.438 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:57.439 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:57.439 17:20:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:57.439 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:26:57.439 17:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:57.697 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:57.697 "name": "BaseBdev3", 00:26:57.697 "aliases": [ 00:26:57.697 "91638bff-acd2-40c3-ab14-5c221caa7b75" 00:26:57.697 ], 00:26:57.697 "product_name": "Malloc disk", 00:26:57.697 "block_size": 512, 00:26:57.697 "num_blocks": 65536, 00:26:57.697 "uuid": "91638bff-acd2-40c3-ab14-5c221caa7b75", 00:26:57.697 "assigned_rate_limits": { 00:26:57.697 "rw_ios_per_sec": 0, 00:26:57.697 "rw_mbytes_per_sec": 0, 00:26:57.697 "r_mbytes_per_sec": 0, 00:26:57.697 "w_mbytes_per_sec": 0 00:26:57.697 }, 00:26:57.697 "claimed": true, 00:26:57.697 "claim_type": "exclusive_write", 00:26:57.697 "zoned": false, 00:26:57.697 "supported_io_types": { 00:26:57.697 "read": true, 00:26:57.697 "write": true, 00:26:57.697 "unmap": true, 00:26:57.697 "flush": true, 00:26:57.697 "reset": true, 00:26:57.697 "nvme_admin": false, 00:26:57.697 "nvme_io": false, 00:26:57.697 "nvme_io_md": false, 00:26:57.697 "write_zeroes": true, 00:26:57.697 "zcopy": true, 00:26:57.697 "get_zone_info": false, 00:26:57.697 "zone_management": false, 00:26:57.697 "zone_append": false, 00:26:57.697 "compare": false, 00:26:57.697 "compare_and_write": false, 00:26:57.697 "abort": true, 00:26:57.697 "seek_hole": false, 00:26:57.697 "seek_data": false, 00:26:57.697 "copy": true, 00:26:57.697 "nvme_iov_md": false 00:26:57.697 }, 00:26:57.697 "memory_domains": [ 00:26:57.697 { 00:26:57.697 "dma_device_id": "system", 00:26:57.697 "dma_device_type": 1 00:26:57.697 }, 00:26:57.697 { 00:26:57.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:57.697 
"dma_device_type": 2 00:26:57.697 } 00:26:57.697 ], 00:26:57.697 "driver_specific": {} 00:26:57.697 }' 00:26:57.697 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:57.697 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:57.955 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.213 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.213 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:58.214 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:58.214 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:26:58.214 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:58.472 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:58.472 "name": "BaseBdev4", 00:26:58.472 "aliases": [ 00:26:58.472 
"90d3404e-ca65-488f-827c-4000f9d15e65" 00:26:58.472 ], 00:26:58.472 "product_name": "Malloc disk", 00:26:58.472 "block_size": 512, 00:26:58.472 "num_blocks": 65536, 00:26:58.472 "uuid": "90d3404e-ca65-488f-827c-4000f9d15e65", 00:26:58.472 "assigned_rate_limits": { 00:26:58.472 "rw_ios_per_sec": 0, 00:26:58.472 "rw_mbytes_per_sec": 0, 00:26:58.472 "r_mbytes_per_sec": 0, 00:26:58.472 "w_mbytes_per_sec": 0 00:26:58.472 }, 00:26:58.472 "claimed": true, 00:26:58.472 "claim_type": "exclusive_write", 00:26:58.472 "zoned": false, 00:26:58.472 "supported_io_types": { 00:26:58.472 "read": true, 00:26:58.472 "write": true, 00:26:58.472 "unmap": true, 00:26:58.472 "flush": true, 00:26:58.472 "reset": true, 00:26:58.472 "nvme_admin": false, 00:26:58.472 "nvme_io": false, 00:26:58.472 "nvme_io_md": false, 00:26:58.472 "write_zeroes": true, 00:26:58.472 "zcopy": true, 00:26:58.472 "get_zone_info": false, 00:26:58.472 "zone_management": false, 00:26:58.472 "zone_append": false, 00:26:58.472 "compare": false, 00:26:58.472 "compare_and_write": false, 00:26:58.472 "abort": true, 00:26:58.472 "seek_hole": false, 00:26:58.472 "seek_data": false, 00:26:58.472 "copy": true, 00:26:58.472 "nvme_iov_md": false 00:26:58.472 }, 00:26:58.472 "memory_domains": [ 00:26:58.472 { 00:26:58.472 "dma_device_id": "system", 00:26:58.472 "dma_device_type": 1 00:26:58.472 }, 00:26:58.472 { 00:26:58.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:58.473 "dma_device_type": 2 00:26:58.473 } 00:26:58.473 ], 00:26:58.473 "driver_specific": {} 00:26:58.473 }' 00:26:58.473 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:58.473 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:58.473 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:26:58.473 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:58.473 17:20:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:58.473 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:58.473 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:58.731 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:58.731 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:58.731 17:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.731 17:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:58.731 17:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:58.731 17:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:58.990 [2024-07-23 17:20:54.293825] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:58.990 [2024-07-23 17:20:54.293854] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:58.990 [2024-07-23 17:20:54.293929] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:58.990 [2024-07-23 17:20:54.294194] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:58.990 [2024-07-23 17:20:54.294206] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaf2c50 name Existed_Raid, state offline 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 15012 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 15012 ']' 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 15012 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 15012 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 15012' 00:26:58.990 killing process with pid 15012 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 15012 00:26:58.990 [2024-07-23 17:20:54.361223] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:58.990 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 15012 00:26:58.990 [2024-07-23 17:20:54.398067] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:59.249 17:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:26:59.249 00:26:59.249 real 0m33.853s 00:26:59.249 user 1m2.528s 00:26:59.249 sys 0m6.188s 00:26:59.249 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:59.249 17:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:59.249 ************************************ 00:26:59.249 END TEST raid_state_function_test_sb 00:26:59.249 ************************************ 00:26:59.249 17:20:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:59.249 17:20:54 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test 
raid_superblock_test raid1 4 00:26:59.249 17:20:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:59.249 17:20:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:59.249 17:20:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:59.508 ************************************ 00:26:59.508 START TEST raid_superblock_test 00:26:59.508 ************************************ 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' 
raid1 '!=' raid1 ']' 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=20054 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 20054 /var/tmp/spdk-raid.sock 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 20054 ']' 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:59.508 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:59.508 17:20:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:26:59.508 [2024-07-23 17:20:54.788307] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
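The trace above shows the test entry point (`raid_superblock_test raid1 4`) declaring its locals and then evaluating `'[' raid1 '!=' raid1 ']'` before setting `strip_size=0`: RAID1 mirrors whole base bdevs, so no strip-size argument is built for `bdev_raid_create`. A minimal sketch of that branch, with the helper name and the non-raid1 default being assumptions rather than the script's actual values:

```python
# Hedged sketch of the strip-size decision traced at bdev_raid.sh@403-407.
# strip_size_args and the 64 KiB default are illustrative assumptions; only the
# raid1 -> (0, no "-z" flag) behavior is confirmed by the trace above.
def strip_size_args(raid_level: str, default_strip_size_kb: int = 64):
    """Return (strip_size, extra argv for bdev_raid_create)."""
    if raid_level != "raid1":
        # Striped levels pass an explicit strip size via -z.
        return default_strip_size_kb, ["-z", str(default_strip_size_kb)]
    # Matches "strip_size=0" in the trace: raid1 takes no -z argument.
    return 0, []
```

For this run, `strip_size_args("raid1")` yields `(0, [])`, which is why the later `bdev_raid_create` call in the log carries only `-r raid1 ... -s` and no strip-size flag.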
00:26:59.508 [2024-07-23 17:20:54.788448] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid20054 ] 00:26:59.767 [2024-07-23 17:20:54.985569] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:59.767 [2024-07-23 17:20:55.038448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:59.767 [2024-07-23 17:20:55.106805] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:59.767 [2024-07-23 17:20:55.106864] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:00.703 17:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:27:00.962 malloc1 00:27:00.962 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:01.220 [2024-07-23 17:20:56.442729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:01.220 [2024-07-23 17:20:56.442776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:01.220 [2024-07-23 17:20:56.442796] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6e070 00:27:01.220 [2024-07-23 17:20:56.442808] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:01.221 [2024-07-23 17:20:56.444337] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:01.221 [2024-07-23 17:20:56.444364] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:01.221 pt1 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:01.221 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:01.221 17:20:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:27:01.479 malloc2 00:27:01.479 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:01.738 [2024-07-23 17:20:56.962288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:01.738 [2024-07-23 17:20:56.962336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:01.738 [2024-07-23 17:20:56.962354] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a54920 00:27:01.738 [2024-07-23 17:20:56.962366] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:01.738 [2024-07-23 17:20:56.963957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:01.738 [2024-07-23 17:20:56.963985] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:01.738 pt2 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:01.738 17:20:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:01.738 17:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:27:01.997 malloc3 00:27:01.997 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:27:02.256 [2024-07-23 17:20:57.481483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:27:02.256 [2024-07-23 17:20:57.481530] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.256 [2024-07-23 17:20:57.481548] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b663e0 00:27:02.256 [2024-07-23 17:20:57.481560] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.256 [2024-07-23 17:20:57.483120] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.256 [2024-07-23 17:20:57.483147] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:27:02.256 pt3 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:02.256 
17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:02.256 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:27:02.514 malloc4 00:27:02.514 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:27:02.773 [2024-07-23 17:20:57.980688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:27:02.773 [2024-07-23 17:20:57.980732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.773 [2024-07-23 17:20:57.980749] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b68870 00:27:02.773 [2024-07-23 17:20:57.980767] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.773 [2024-07-23 17:20:57.982277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.773 [2024-07-23 17:20:57.982304] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:27:02.773 pt4 00:27:02.773 17:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:02.773 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:02.773 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:27:03.063 [2024-07-23 17:20:58.225359] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
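The four malloc/passthru pairs above are produced by one loop (`bdev_raid.sh@415-425`): for each `i` it issues `bdev_malloc_create 32 512 -b malloc$i`, then wraps the result in a passthru bdev `pt$i` with a fixed zero-padded UUID. A sketch of the RPC command lines that loop generates; the `rpc` path is shortened for illustration and the loop body is reconstructed from the trace, not copied from the script:

```python
# Reconstruction (assumed, from the trace) of the base-bdev creation loop.
num_base_bdevs = 4
rpc = "scripts/rpc.py -s /var/tmp/spdk-raid.sock"  # abbreviated path, illustrative

commands = []
for i in range(1, num_base_bdevs + 1):
    # UUIDs in the log are all-zero except a 12-digit counter in the last group.
    uuid = f"00000000-0000-0000-0000-{i:012d}"
    # 32 MiB malloc bdev with 512-byte blocks, then a passthru claiming it.
    commands.append(f"{rpc} bdev_malloc_create 32 512 -b malloc{i}")
    commands.append(f"{rpc} bdev_passthru_create -b malloc{i} -p pt{i} -u {uuid}")
```

The resulting `pt1 pt2 pt3 pt4` are then handed to `bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s`, which is why the log immediately reports each `pt` bdev being claimed.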
00:27:03.063 [2024-07-23 17:20:58.226641] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:03.063 [2024-07-23 17:20:58.226694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:27:03.063 [2024-07-23 17:20:58.226737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:27:03.063 [2024-07-23 17:20:58.226918] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b69e80 00:27:03.063 [2024-07-23 17:20:58.226931] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:03.063 [2024-07-23 17:20:58.227128] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b66670 00:27:03.063 [2024-07-23 17:20:58.227277] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b69e80 00:27:03.063 [2024-07-23 17:20:58.227288] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b69e80 00:27:03.063 [2024-07-23 17:20:58.227384] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.064 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.322 17:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:03.322 "name": "raid_bdev1", 00:27:03.322 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:03.322 "strip_size_kb": 0, 00:27:03.322 "state": "online", 00:27:03.322 "raid_level": "raid1", 00:27:03.322 "superblock": true, 00:27:03.322 "num_base_bdevs": 4, 00:27:03.322 "num_base_bdevs_discovered": 4, 00:27:03.322 "num_base_bdevs_operational": 4, 00:27:03.322 "base_bdevs_list": [ 00:27:03.322 { 00:27:03.322 "name": "pt1", 00:27:03.322 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:03.322 "is_configured": true, 00:27:03.322 "data_offset": 2048, 00:27:03.322 "data_size": 63488 00:27:03.322 }, 00:27:03.322 { 00:27:03.322 "name": "pt2", 00:27:03.322 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:03.322 "is_configured": true, 00:27:03.322 "data_offset": 2048, 00:27:03.322 "data_size": 63488 00:27:03.322 }, 00:27:03.322 { 00:27:03.322 "name": "pt3", 00:27:03.322 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:03.322 "is_configured": true, 00:27:03.322 "data_offset": 2048, 00:27:03.322 "data_size": 63488 00:27:03.322 }, 00:27:03.322 { 00:27:03.322 "name": "pt4", 00:27:03.322 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:03.322 "is_configured": true, 00:27:03.322 "data_offset": 2048, 00:27:03.322 "data_size": 63488 00:27:03.322 } 00:27:03.322 ] 00:27:03.322 }' 00:27:03.322 17:20:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:03.322 17:20:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:03.955 [2024-07-23 17:20:59.232302] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:03.955 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:03.955 "name": "raid_bdev1", 00:27:03.955 "aliases": [ 00:27:03.955 "6c721821-2f36-4b7a-ab5c-c9e13ee1db44" 00:27:03.955 ], 00:27:03.955 "product_name": "Raid Volume", 00:27:03.955 "block_size": 512, 00:27:03.955 "num_blocks": 63488, 00:27:03.955 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:03.955 "assigned_rate_limits": { 00:27:03.955 "rw_ios_per_sec": 0, 00:27:03.955 "rw_mbytes_per_sec": 0, 00:27:03.955 "r_mbytes_per_sec": 0, 00:27:03.955 "w_mbytes_per_sec": 0 00:27:03.955 }, 00:27:03.955 "claimed": false, 00:27:03.955 "zoned": false, 00:27:03.955 "supported_io_types": { 00:27:03.955 "read": true, 00:27:03.955 "write": true, 00:27:03.955 
"unmap": false, 00:27:03.955 "flush": false, 00:27:03.955 "reset": true, 00:27:03.955 "nvme_admin": false, 00:27:03.955 "nvme_io": false, 00:27:03.955 "nvme_io_md": false, 00:27:03.955 "write_zeroes": true, 00:27:03.955 "zcopy": false, 00:27:03.955 "get_zone_info": false, 00:27:03.955 "zone_management": false, 00:27:03.955 "zone_append": false, 00:27:03.955 "compare": false, 00:27:03.955 "compare_and_write": false, 00:27:03.955 "abort": false, 00:27:03.955 "seek_hole": false, 00:27:03.955 "seek_data": false, 00:27:03.955 "copy": false, 00:27:03.955 "nvme_iov_md": false 00:27:03.955 }, 00:27:03.955 "memory_domains": [ 00:27:03.955 { 00:27:03.955 "dma_device_id": "system", 00:27:03.955 "dma_device_type": 1 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:03.956 "dma_device_type": 2 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "system", 00:27:03.956 "dma_device_type": 1 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:03.956 "dma_device_type": 2 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "system", 00:27:03.956 "dma_device_type": 1 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:03.956 "dma_device_type": 2 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "system", 00:27:03.956 "dma_device_type": 1 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:03.956 "dma_device_type": 2 00:27:03.956 } 00:27:03.956 ], 00:27:03.956 "driver_specific": { 00:27:03.956 "raid": { 00:27:03.956 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:03.956 "strip_size_kb": 0, 00:27:03.956 "state": "online", 00:27:03.956 "raid_level": "raid1", 00:27:03.956 "superblock": true, 00:27:03.956 "num_base_bdevs": 4, 00:27:03.956 "num_base_bdevs_discovered": 4, 00:27:03.956 "num_base_bdevs_operational": 4, 00:27:03.956 "base_bdevs_list": [ 00:27:03.956 { 00:27:03.956 "name": "pt1", 
00:27:03.956 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:03.956 "is_configured": true, 00:27:03.956 "data_offset": 2048, 00:27:03.956 "data_size": 63488 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "name": "pt2", 00:27:03.956 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:03.956 "is_configured": true, 00:27:03.956 "data_offset": 2048, 00:27:03.956 "data_size": 63488 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "name": "pt3", 00:27:03.956 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:03.956 "is_configured": true, 00:27:03.956 "data_offset": 2048, 00:27:03.956 "data_size": 63488 00:27:03.956 }, 00:27:03.956 { 00:27:03.956 "name": "pt4", 00:27:03.956 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:03.956 "is_configured": true, 00:27:03.956 "data_offset": 2048, 00:27:03.956 "data_size": 63488 00:27:03.956 } 00:27:03.956 ] 00:27:03.956 } 00:27:03.956 } 00:27:03.956 }' 00:27:03.956 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:03.956 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:03.956 pt2 00:27:03.956 pt3 00:27:03.956 pt4' 00:27:03.956 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:03.956 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:03.956 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:04.215 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:04.215 "name": "pt1", 00:27:04.215 "aliases": [ 00:27:04.215 "00000000-0000-0000-0000-000000000001" 00:27:04.215 ], 00:27:04.215 "product_name": "passthru", 00:27:04.215 "block_size": 512, 00:27:04.215 "num_blocks": 65536, 00:27:04.215 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:27:04.215 "assigned_rate_limits": { 00:27:04.215 "rw_ios_per_sec": 0, 00:27:04.215 "rw_mbytes_per_sec": 0, 00:27:04.215 "r_mbytes_per_sec": 0, 00:27:04.215 "w_mbytes_per_sec": 0 00:27:04.215 }, 00:27:04.215 "claimed": true, 00:27:04.215 "claim_type": "exclusive_write", 00:27:04.215 "zoned": false, 00:27:04.215 "supported_io_types": { 00:27:04.215 "read": true, 00:27:04.215 "write": true, 00:27:04.215 "unmap": true, 00:27:04.215 "flush": true, 00:27:04.215 "reset": true, 00:27:04.215 "nvme_admin": false, 00:27:04.215 "nvme_io": false, 00:27:04.215 "nvme_io_md": false, 00:27:04.215 "write_zeroes": true, 00:27:04.215 "zcopy": true, 00:27:04.215 "get_zone_info": false, 00:27:04.215 "zone_management": false, 00:27:04.215 "zone_append": false, 00:27:04.215 "compare": false, 00:27:04.215 "compare_and_write": false, 00:27:04.215 "abort": true, 00:27:04.215 "seek_hole": false, 00:27:04.215 "seek_data": false, 00:27:04.215 "copy": true, 00:27:04.215 "nvme_iov_md": false 00:27:04.215 }, 00:27:04.215 "memory_domains": [ 00:27:04.215 { 00:27:04.215 "dma_device_id": "system", 00:27:04.215 "dma_device_type": 1 00:27:04.215 }, 00:27:04.215 { 00:27:04.215 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:04.215 "dma_device_type": 2 00:27:04.215 } 00:27:04.215 ], 00:27:04.215 "driver_specific": { 00:27:04.215 "passthru": { 00:27:04.215 "name": "pt1", 00:27:04.215 "base_bdev_name": "malloc1" 00:27:04.215 } 00:27:04.215 } 00:27:04.215 }' 00:27:04.215 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:04.215 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:04.474 17:20:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:04.474 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:04.733 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:04.733 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:04.733 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:04.733 17:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:04.991 "name": "pt2", 00:27:04.991 "aliases": [ 00:27:04.991 "00000000-0000-0000-0000-000000000002" 00:27:04.991 ], 00:27:04.991 "product_name": "passthru", 00:27:04.991 "block_size": 512, 00:27:04.991 "num_blocks": 65536, 00:27:04.991 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:04.991 "assigned_rate_limits": { 00:27:04.991 "rw_ios_per_sec": 0, 00:27:04.991 "rw_mbytes_per_sec": 0, 00:27:04.991 "r_mbytes_per_sec": 0, 00:27:04.991 "w_mbytes_per_sec": 0 00:27:04.991 }, 00:27:04.991 "claimed": true, 00:27:04.991 "claim_type": "exclusive_write", 00:27:04.991 "zoned": false, 00:27:04.991 "supported_io_types": { 00:27:04.991 "read": true, 00:27:04.991 "write": true, 00:27:04.991 "unmap": true, 00:27:04.991 "flush": true, 00:27:04.991 "reset": true, 00:27:04.991 "nvme_admin": false, 00:27:04.991 
"nvme_io": false, 00:27:04.991 "nvme_io_md": false, 00:27:04.991 "write_zeroes": true, 00:27:04.991 "zcopy": true, 00:27:04.991 "get_zone_info": false, 00:27:04.991 "zone_management": false, 00:27:04.991 "zone_append": false, 00:27:04.991 "compare": false, 00:27:04.991 "compare_and_write": false, 00:27:04.991 "abort": true, 00:27:04.991 "seek_hole": false, 00:27:04.991 "seek_data": false, 00:27:04.991 "copy": true, 00:27:04.991 "nvme_iov_md": false 00:27:04.991 }, 00:27:04.991 "memory_domains": [ 00:27:04.991 { 00:27:04.991 "dma_device_id": "system", 00:27:04.991 "dma_device_type": 1 00:27:04.991 }, 00:27:04.991 { 00:27:04.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:04.991 "dma_device_type": 2 00:27:04.991 } 00:27:04.991 ], 00:27:04.991 "driver_specific": { 00:27:04.991 "passthru": { 00:27:04.991 "name": "pt2", 00:27:04.991 "base_bdev_name": "malloc2" 00:27:04.991 } 00:27:04.991 } 00:27:04.991 }' 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:04.991 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:27:05.250 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:05.509 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:05.509 "name": "pt3", 00:27:05.509 "aliases": [ 00:27:05.509 "00000000-0000-0000-0000-000000000003" 00:27:05.509 ], 00:27:05.509 "product_name": "passthru", 00:27:05.509 "block_size": 512, 00:27:05.509 "num_blocks": 65536, 00:27:05.509 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:05.509 "assigned_rate_limits": { 00:27:05.509 "rw_ios_per_sec": 0, 00:27:05.509 "rw_mbytes_per_sec": 0, 00:27:05.509 "r_mbytes_per_sec": 0, 00:27:05.509 "w_mbytes_per_sec": 0 00:27:05.509 }, 00:27:05.509 "claimed": true, 00:27:05.509 "claim_type": "exclusive_write", 00:27:05.509 "zoned": false, 00:27:05.509 "supported_io_types": { 00:27:05.509 "read": true, 00:27:05.509 "write": true, 00:27:05.509 "unmap": true, 00:27:05.509 "flush": true, 00:27:05.509 "reset": true, 00:27:05.509 "nvme_admin": false, 00:27:05.509 "nvme_io": false, 00:27:05.509 "nvme_io_md": false, 00:27:05.509 "write_zeroes": true, 00:27:05.509 "zcopy": true, 00:27:05.509 "get_zone_info": false, 00:27:05.509 "zone_management": false, 00:27:05.509 "zone_append": false, 00:27:05.509 "compare": false, 00:27:05.509 "compare_and_write": false, 00:27:05.509 "abort": true, 00:27:05.509 "seek_hole": false, 00:27:05.509 "seek_data": false, 00:27:05.509 "copy": true, 00:27:05.509 "nvme_iov_md": false 00:27:05.509 }, 00:27:05.509 "memory_domains": [ 00:27:05.509 { 00:27:05.509 "dma_device_id": "system", 00:27:05.509 
"dma_device_type": 1 00:27:05.509 }, 00:27:05.509 { 00:27:05.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:05.509 "dma_device_type": 2 00:27:05.509 } 00:27:05.509 ], 00:27:05.509 "driver_specific": { 00:27:05.509 "passthru": { 00:27:05.509 "name": "pt3", 00:27:05.509 "base_bdev_name": "malloc3" 00:27:05.509 } 00:27:05.509 } 00:27:05.509 }' 00:27:05.509 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:05.509 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:05.509 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:05.509 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:05.767 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:05.767 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:05.767 17:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:05.767 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:05.767 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:05.768 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:05.768 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:05.768 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:05.768 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:05.768 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:27:05.768 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:06.026 17:21:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:06.026 "name": "pt4", 00:27:06.026 "aliases": [ 00:27:06.026 "00000000-0000-0000-0000-000000000004" 00:27:06.026 ], 00:27:06.026 "product_name": "passthru", 00:27:06.026 "block_size": 512, 00:27:06.026 "num_blocks": 65536, 00:27:06.026 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:06.026 "assigned_rate_limits": { 00:27:06.026 "rw_ios_per_sec": 0, 00:27:06.026 "rw_mbytes_per_sec": 0, 00:27:06.026 "r_mbytes_per_sec": 0, 00:27:06.026 "w_mbytes_per_sec": 0 00:27:06.026 }, 00:27:06.026 "claimed": true, 00:27:06.026 "claim_type": "exclusive_write", 00:27:06.026 "zoned": false, 00:27:06.026 "supported_io_types": { 00:27:06.026 "read": true, 00:27:06.026 "write": true, 00:27:06.026 "unmap": true, 00:27:06.026 "flush": true, 00:27:06.026 "reset": true, 00:27:06.026 "nvme_admin": false, 00:27:06.026 "nvme_io": false, 00:27:06.026 "nvme_io_md": false, 00:27:06.026 "write_zeroes": true, 00:27:06.026 "zcopy": true, 00:27:06.026 "get_zone_info": false, 00:27:06.026 "zone_management": false, 00:27:06.026 "zone_append": false, 00:27:06.026 "compare": false, 00:27:06.026 "compare_and_write": false, 00:27:06.026 "abort": true, 00:27:06.026 "seek_hole": false, 00:27:06.026 "seek_data": false, 00:27:06.026 "copy": true, 00:27:06.026 "nvme_iov_md": false 00:27:06.026 }, 00:27:06.026 "memory_domains": [ 00:27:06.026 { 00:27:06.026 "dma_device_id": "system", 00:27:06.026 "dma_device_type": 1 00:27:06.026 }, 00:27:06.026 { 00:27:06.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:06.026 "dma_device_type": 2 00:27:06.026 } 00:27:06.026 ], 00:27:06.026 "driver_specific": { 00:27:06.026 "passthru": { 00:27:06.026 "name": "pt4", 00:27:06.026 "base_bdev_name": "malloc4" 00:27:06.026 } 00:27:06.026 } 00:27:06.026 }' 00:27:06.026 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.026 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:06.285 17:21:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.285 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:06.544 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:06.544 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:06.544 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:06.544 [2024-07-23 17:21:01.959553] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:06.803 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=6c721821-2f36-4b7a-ab5c-c9e13ee1db44 00:27:06.803 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 6c721821-2f36-4b7a-ab5c-c9e13ee1db44 ']' 00:27:06.803 17:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:06.803 [2024-07-23 17:21:02.203883] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:06.803 
[2024-07-23 17:21:02.203913] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:06.803 [2024-07-23 17:21:02.203960] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:06.803 [2024-07-23 17:21:02.204043] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:06.803 [2024-07-23 17:21:02.204055] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b69e80 name raid_bdev1, state offline 00:27:07.062 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.062 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:07.062 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:07.062 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:07.062 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.062 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:07.321 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.321 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:07.579 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.579 17:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:27:07.838 17:21:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:07.838 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:27:08.096 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:08.096 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:08.354 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:08.354 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:27:08.354 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:27:08.354 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:27:08.354 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:08.355 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:27:08.613 [2024-07-23 17:21:03.912321] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:08.613 [2024-07-23 17:21:03.913657] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:08.613 [2024-07-23 17:21:03.913699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:27:08.613 [2024-07-23 17:21:03.913732] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:27:08.613 [2024-07-23 17:21:03.913775] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:08.613 [2024-07-23 17:21:03.913813] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:08.613 [2024-07-23 17:21:03.913835] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:27:08.613 [2024-07-23 17:21:03.913857] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:27:08.613 [2024-07-23 
17:21:03.913874] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:08.613 [2024-07-23 17:21:03.913884] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a54090 name raid_bdev1, state configuring 00:27:08.613 request: 00:27:08.613 { 00:27:08.613 "name": "raid_bdev1", 00:27:08.613 "raid_level": "raid1", 00:27:08.613 "base_bdevs": [ 00:27:08.613 "malloc1", 00:27:08.613 "malloc2", 00:27:08.613 "malloc3", 00:27:08.613 "malloc4" 00:27:08.613 ], 00:27:08.613 "superblock": false, 00:27:08.613 "method": "bdev_raid_create", 00:27:08.613 "req_id": 1 00:27:08.613 } 00:27:08.613 Got JSON-RPC error response 00:27:08.613 response: 00:27:08.613 { 00:27:08.613 "code": -17, 00:27:08.613 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:08.613 } 00:27:08.613 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:27:08.613 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:08.613 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:08.613 17:21:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:08.613 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.613 17:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:08.872 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:08.872 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:08.872 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:09.130 [2024-07-23 17:21:04.405559] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:09.130 [2024-07-23 17:21:04.405598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:09.130 [2024-07-23 17:21:04.405614] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6aed0 00:27:09.130 [2024-07-23 17:21:04.405626] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:09.130 [2024-07-23 17:21:04.407179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:09.130 [2024-07-23 17:21:04.407207] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:09.130 [2024-07-23 17:21:04.407269] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:09.130 [2024-07-23 17:21:04.407295] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:09.130 pt1 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.130 17:21:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.130 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.389 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.389 "name": "raid_bdev1", 00:27:09.389 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:09.389 "strip_size_kb": 0, 00:27:09.389 "state": "configuring", 00:27:09.389 "raid_level": "raid1", 00:27:09.389 "superblock": true, 00:27:09.389 "num_base_bdevs": 4, 00:27:09.389 "num_base_bdevs_discovered": 1, 00:27:09.389 "num_base_bdevs_operational": 4, 00:27:09.389 "base_bdevs_list": [ 00:27:09.389 { 00:27:09.389 "name": "pt1", 00:27:09.389 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:09.389 "is_configured": true, 00:27:09.389 "data_offset": 2048, 00:27:09.389 "data_size": 63488 00:27:09.389 }, 00:27:09.389 { 00:27:09.389 "name": null, 00:27:09.389 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:09.389 "is_configured": false, 00:27:09.389 "data_offset": 2048, 00:27:09.389 "data_size": 63488 00:27:09.389 }, 00:27:09.389 { 00:27:09.389 "name": null, 00:27:09.389 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:09.389 "is_configured": false, 00:27:09.389 "data_offset": 2048, 00:27:09.389 "data_size": 63488 00:27:09.389 }, 00:27:09.389 { 00:27:09.389 "name": null, 00:27:09.389 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:09.389 "is_configured": false, 00:27:09.389 "data_offset": 2048, 00:27:09.389 "data_size": 63488 00:27:09.389 } 00:27:09.389 ] 00:27:09.389 }' 00:27:09.389 17:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.389 17:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:27:09.955 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:27:09.955 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:10.213 [2024-07-23 17:21:05.500587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:10.213 [2024-07-23 17:21:05.500637] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:10.213 [2024-07-23 17:21:05.500656] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6b560 00:27:10.213 [2024-07-23 17:21:05.500668] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:10.213 [2024-07-23 17:21:05.501004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:10.213 [2024-07-23 17:21:05.501022] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:10.213 [2024-07-23 17:21:05.501085] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:10.213 [2024-07-23 17:21:05.501104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:10.213 pt2 00:27:10.213 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:10.472 [2024-07-23 17:21:05.753268] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.472 17:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.731 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.731 "name": "raid_bdev1", 00:27:10.731 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:10.731 "strip_size_kb": 0, 00:27:10.731 "state": "configuring", 00:27:10.731 "raid_level": "raid1", 00:27:10.731 "superblock": true, 00:27:10.731 "num_base_bdevs": 4, 00:27:10.731 "num_base_bdevs_discovered": 1, 00:27:10.731 "num_base_bdevs_operational": 4, 00:27:10.731 "base_bdevs_list": [ 00:27:10.731 { 00:27:10.731 "name": "pt1", 00:27:10.731 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:10.731 "is_configured": true, 00:27:10.731 "data_offset": 2048, 00:27:10.731 "data_size": 63488 00:27:10.731 }, 00:27:10.731 { 00:27:10.731 "name": null, 00:27:10.731 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:10.731 "is_configured": false, 00:27:10.731 "data_offset": 2048, 00:27:10.731 
"data_size": 63488 00:27:10.731 }, 00:27:10.731 { 00:27:10.731 "name": null, 00:27:10.731 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:10.731 "is_configured": false, 00:27:10.731 "data_offset": 2048, 00:27:10.731 "data_size": 63488 00:27:10.731 }, 00:27:10.731 { 00:27:10.731 "name": null, 00:27:10.731 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:10.731 "is_configured": false, 00:27:10.731 "data_offset": 2048, 00:27:10.731 "data_size": 63488 00:27:10.731 } 00:27:10.731 ] 00:27:10.731 }' 00:27:10.731 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.731 17:21:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:11.297 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:11.297 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:11.297 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:11.556 [2024-07-23 17:21:06.840155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:11.556 [2024-07-23 17:21:06.840203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:11.556 [2024-07-23 17:21:06.840223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6a310 00:27:11.556 [2024-07-23 17:21:06.840236] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:11.556 [2024-07-23 17:21:06.840564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:11.556 [2024-07-23 17:21:06.840581] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:11.556 [2024-07-23 17:21:06.840642] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:27:11.556 [2024-07-23 17:21:06.840660] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:11.556 pt2 00:27:11.556 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:11.556 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:11.556 17:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:27:11.815 [2024-07-23 17:21:07.084802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:27:11.815 [2024-07-23 17:21:07.084838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:11.815 [2024-07-23 17:21:07.084857] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bcca0 00:27:11.815 [2024-07-23 17:21:07.084869] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:11.815 [2024-07-23 17:21:07.085179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:11.815 [2024-07-23 17:21:07.085197] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:27:11.816 [2024-07-23 17:21:07.085250] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:27:11.816 [2024-07-23 17:21:07.085273] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:27:11.816 pt3 00:27:11.816 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:11.816 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:11.816 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:27:12.076 [2024-07-23 17:21:07.329461] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:27:12.076 [2024-07-23 17:21:07.329493] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:12.076 [2024-07-23 17:21:07.329512] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bd3f0 00:27:12.076 [2024-07-23 17:21:07.329524] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:12.076 [2024-07-23 17:21:07.329826] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:12.076 [2024-07-23 17:21:07.329843] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:27:12.076 [2024-07-23 17:21:07.329902] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:27:12.076 [2024-07-23 17:21:07.329920] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:27:12.076 [2024-07-23 17:21:07.330038] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b6da90 00:27:12.076 [2024-07-23 17:21:07.330048] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:12.076 [2024-07-23 17:21:07.330218] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b6cd40 00:27:12.076 [2024-07-23 17:21:07.330349] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b6da90 00:27:12.076 [2024-07-23 17:21:07.330359] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b6da90 00:27:12.076 [2024-07-23 17:21:07.330457] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:12.076 pt4 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:12.076 17:21:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.076 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.340 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.340 "name": "raid_bdev1", 00:27:12.340 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:12.340 "strip_size_kb": 0, 00:27:12.340 "state": "online", 00:27:12.340 "raid_level": "raid1", 00:27:12.340 "superblock": true, 00:27:12.340 "num_base_bdevs": 4, 00:27:12.340 "num_base_bdevs_discovered": 4, 00:27:12.340 "num_base_bdevs_operational": 4, 00:27:12.340 "base_bdevs_list": [ 00:27:12.340 { 00:27:12.340 "name": "pt1", 00:27:12.340 "uuid": "00000000-0000-0000-0000-000000000001", 
00:27:12.340 "is_configured": true, 00:27:12.340 "data_offset": 2048, 00:27:12.340 "data_size": 63488 00:27:12.340 }, 00:27:12.340 { 00:27:12.340 "name": "pt2", 00:27:12.340 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:12.340 "is_configured": true, 00:27:12.340 "data_offset": 2048, 00:27:12.340 "data_size": 63488 00:27:12.340 }, 00:27:12.340 { 00:27:12.340 "name": "pt3", 00:27:12.340 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:12.340 "is_configured": true, 00:27:12.340 "data_offset": 2048, 00:27:12.340 "data_size": 63488 00:27:12.340 }, 00:27:12.340 { 00:27:12.340 "name": "pt4", 00:27:12.340 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:12.340 "is_configured": true, 00:27:12.340 "data_offset": 2048, 00:27:12.340 "data_size": 63488 00:27:12.340 } 00:27:12.340 ] 00:27:12.340 }' 00:27:12.340 17:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.340 17:21:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:12.907 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:13.166 [2024-07-23 17:21:08.364511] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:13.166 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:13.166 "name": "raid_bdev1", 00:27:13.166 "aliases": [ 00:27:13.166 "6c721821-2f36-4b7a-ab5c-c9e13ee1db44" 00:27:13.166 ], 00:27:13.166 "product_name": "Raid Volume", 00:27:13.166 "block_size": 512, 00:27:13.166 "num_blocks": 63488, 00:27:13.166 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:13.166 "assigned_rate_limits": { 00:27:13.166 "rw_ios_per_sec": 0, 00:27:13.166 "rw_mbytes_per_sec": 0, 00:27:13.166 "r_mbytes_per_sec": 0, 00:27:13.167 "w_mbytes_per_sec": 0 00:27:13.167 }, 00:27:13.167 "claimed": false, 00:27:13.167 "zoned": false, 00:27:13.167 "supported_io_types": { 00:27:13.167 "read": true, 00:27:13.167 "write": true, 00:27:13.167 "unmap": false, 00:27:13.167 "flush": false, 00:27:13.167 "reset": true, 00:27:13.167 "nvme_admin": false, 00:27:13.167 "nvme_io": false, 00:27:13.167 "nvme_io_md": false, 00:27:13.167 "write_zeroes": true, 00:27:13.167 "zcopy": false, 00:27:13.167 "get_zone_info": false, 00:27:13.167 "zone_management": false, 00:27:13.167 "zone_append": false, 00:27:13.167 "compare": false, 00:27:13.167 "compare_and_write": false, 00:27:13.167 "abort": false, 00:27:13.167 "seek_hole": false, 00:27:13.167 "seek_data": false, 00:27:13.167 "copy": false, 00:27:13.167 "nvme_iov_md": false 00:27:13.167 }, 00:27:13.167 "memory_domains": [ 00:27:13.167 { 00:27:13.167 "dma_device_id": "system", 00:27:13.167 "dma_device_type": 1 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.167 "dma_device_type": 2 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "system", 00:27:13.167 "dma_device_type": 1 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.167 "dma_device_type": 2 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "system", 00:27:13.167 
"dma_device_type": 1 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.167 "dma_device_type": 2 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "system", 00:27:13.167 "dma_device_type": 1 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.167 "dma_device_type": 2 00:27:13.167 } 00:27:13.167 ], 00:27:13.167 "driver_specific": { 00:27:13.167 "raid": { 00:27:13.167 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:13.167 "strip_size_kb": 0, 00:27:13.167 "state": "online", 00:27:13.167 "raid_level": "raid1", 00:27:13.167 "superblock": true, 00:27:13.167 "num_base_bdevs": 4, 00:27:13.167 "num_base_bdevs_discovered": 4, 00:27:13.167 "num_base_bdevs_operational": 4, 00:27:13.167 "base_bdevs_list": [ 00:27:13.167 { 00:27:13.167 "name": "pt1", 00:27:13.167 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:13.167 "is_configured": true, 00:27:13.167 "data_offset": 2048, 00:27:13.167 "data_size": 63488 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "name": "pt2", 00:27:13.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:13.167 "is_configured": true, 00:27:13.167 "data_offset": 2048, 00:27:13.167 "data_size": 63488 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "name": "pt3", 00:27:13.167 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:13.167 "is_configured": true, 00:27:13.167 "data_offset": 2048, 00:27:13.167 "data_size": 63488 00:27:13.167 }, 00:27:13.167 { 00:27:13.167 "name": "pt4", 00:27:13.167 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:13.167 "is_configured": true, 00:27:13.167 "data_offset": 2048, 00:27:13.167 "data_size": 63488 00:27:13.167 } 00:27:13.167 ] 00:27:13.167 } 00:27:13.167 } 00:27:13.167 }' 00:27:13.167 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:13.167 17:21:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:13.167 pt2 00:27:13.167 pt3 00:27:13.167 pt4' 00:27:13.167 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:13.167 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:13.167 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:13.426 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:13.426 "name": "pt1", 00:27:13.426 "aliases": [ 00:27:13.426 "00000000-0000-0000-0000-000000000001" 00:27:13.426 ], 00:27:13.426 "product_name": "passthru", 00:27:13.426 "block_size": 512, 00:27:13.426 "num_blocks": 65536, 00:27:13.426 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:13.426 "assigned_rate_limits": { 00:27:13.426 "rw_ios_per_sec": 0, 00:27:13.427 "rw_mbytes_per_sec": 0, 00:27:13.427 "r_mbytes_per_sec": 0, 00:27:13.427 "w_mbytes_per_sec": 0 00:27:13.427 }, 00:27:13.427 "claimed": true, 00:27:13.427 "claim_type": "exclusive_write", 00:27:13.427 "zoned": false, 00:27:13.427 "supported_io_types": { 00:27:13.427 "read": true, 00:27:13.427 "write": true, 00:27:13.427 "unmap": true, 00:27:13.427 "flush": true, 00:27:13.427 "reset": true, 00:27:13.427 "nvme_admin": false, 00:27:13.427 "nvme_io": false, 00:27:13.427 "nvme_io_md": false, 00:27:13.427 "write_zeroes": true, 00:27:13.427 "zcopy": true, 00:27:13.427 "get_zone_info": false, 00:27:13.427 "zone_management": false, 00:27:13.427 "zone_append": false, 00:27:13.427 "compare": false, 00:27:13.427 "compare_and_write": false, 00:27:13.427 "abort": true, 00:27:13.427 "seek_hole": false, 00:27:13.427 "seek_data": false, 00:27:13.427 "copy": true, 00:27:13.427 "nvme_iov_md": false 00:27:13.427 }, 00:27:13.427 "memory_domains": [ 00:27:13.427 { 00:27:13.427 "dma_device_id": "system", 00:27:13.427 
"dma_device_type": 1 00:27:13.427 }, 00:27:13.427 { 00:27:13.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.427 "dma_device_type": 2 00:27:13.427 } 00:27:13.427 ], 00:27:13.427 "driver_specific": { 00:27:13.427 "passthru": { 00:27:13.427 "name": "pt1", 00:27:13.427 "base_bdev_name": "malloc1" 00:27:13.427 } 00:27:13.427 } 00:27:13.427 }' 00:27:13.427 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.427 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.427 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:13.427 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:13.427 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:13.686 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:13.686 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:13.686 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:13.686 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:13.686 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:13.686 17:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:13.686 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:13.686 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:13.686 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:13.686 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:13.945 17:21:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:13.945 "name": "pt2", 00:27:13.945 "aliases": [ 00:27:13.945 "00000000-0000-0000-0000-000000000002" 00:27:13.945 ], 00:27:13.945 "product_name": "passthru", 00:27:13.945 "block_size": 512, 00:27:13.945 "num_blocks": 65536, 00:27:13.945 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:13.945 "assigned_rate_limits": { 00:27:13.945 "rw_ios_per_sec": 0, 00:27:13.945 "rw_mbytes_per_sec": 0, 00:27:13.945 "r_mbytes_per_sec": 0, 00:27:13.945 "w_mbytes_per_sec": 0 00:27:13.945 }, 00:27:13.945 "claimed": true, 00:27:13.945 "claim_type": "exclusive_write", 00:27:13.945 "zoned": false, 00:27:13.945 "supported_io_types": { 00:27:13.945 "read": true, 00:27:13.945 "write": true, 00:27:13.945 "unmap": true, 00:27:13.945 "flush": true, 00:27:13.945 "reset": true, 00:27:13.945 "nvme_admin": false, 00:27:13.945 "nvme_io": false, 00:27:13.945 "nvme_io_md": false, 00:27:13.945 "write_zeroes": true, 00:27:13.945 "zcopy": true, 00:27:13.945 "get_zone_info": false, 00:27:13.945 "zone_management": false, 00:27:13.945 "zone_append": false, 00:27:13.945 "compare": false, 00:27:13.945 "compare_and_write": false, 00:27:13.945 "abort": true, 00:27:13.945 "seek_hole": false, 00:27:13.945 "seek_data": false, 00:27:13.945 "copy": true, 00:27:13.945 "nvme_iov_md": false 00:27:13.945 }, 00:27:13.945 "memory_domains": [ 00:27:13.945 { 00:27:13.945 "dma_device_id": "system", 00:27:13.945 "dma_device_type": 1 00:27:13.945 }, 00:27:13.945 { 00:27:13.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.945 "dma_device_type": 2 00:27:13.945 } 00:27:13.945 ], 00:27:13.945 "driver_specific": { 00:27:13.945 "passthru": { 00:27:13.945 "name": "pt2", 00:27:13.946 "base_bdev_name": "malloc2" 00:27:13.946 } 00:27:13.946 } 00:27:13.946 }' 00:27:13.946 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.946 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:14.205 17:21:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:14.205 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:14.463 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:14.464 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:14.464 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:27:14.464 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:14.464 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:14.464 "name": "pt3", 00:27:14.464 "aliases": [ 00:27:14.464 "00000000-0000-0000-0000-000000000003" 00:27:14.464 ], 00:27:14.464 "product_name": "passthru", 00:27:14.464 "block_size": 512, 00:27:14.464 "num_blocks": 65536, 00:27:14.464 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:14.464 "assigned_rate_limits": { 00:27:14.464 "rw_ios_per_sec": 0, 00:27:14.464 "rw_mbytes_per_sec": 0, 00:27:14.464 "r_mbytes_per_sec": 0, 00:27:14.464 "w_mbytes_per_sec": 0 00:27:14.464 }, 00:27:14.464 "claimed": true, 00:27:14.464 
"claim_type": "exclusive_write", 00:27:14.464 "zoned": false, 00:27:14.464 "supported_io_types": { 00:27:14.464 "read": true, 00:27:14.464 "write": true, 00:27:14.464 "unmap": true, 00:27:14.464 "flush": true, 00:27:14.464 "reset": true, 00:27:14.464 "nvme_admin": false, 00:27:14.464 "nvme_io": false, 00:27:14.464 "nvme_io_md": false, 00:27:14.464 "write_zeroes": true, 00:27:14.464 "zcopy": true, 00:27:14.464 "get_zone_info": false, 00:27:14.464 "zone_management": false, 00:27:14.464 "zone_append": false, 00:27:14.464 "compare": false, 00:27:14.464 "compare_and_write": false, 00:27:14.464 "abort": true, 00:27:14.464 "seek_hole": false, 00:27:14.464 "seek_data": false, 00:27:14.464 "copy": true, 00:27:14.464 "nvme_iov_md": false 00:27:14.464 }, 00:27:14.464 "memory_domains": [ 00:27:14.464 { 00:27:14.464 "dma_device_id": "system", 00:27:14.464 "dma_device_type": 1 00:27:14.464 }, 00:27:14.464 { 00:27:14.464 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:14.464 "dma_device_type": 2 00:27:14.464 } 00:27:14.464 ], 00:27:14.464 "driver_specific": { 00:27:14.464 "passthru": { 00:27:14.464 "name": "pt3", 00:27:14.464 "base_bdev_name": "malloc3" 00:27:14.464 } 00:27:14.464 } 00:27:14.464 }' 00:27:14.464 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:14.722 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:14.722 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:14.722 17:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.722 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.722 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:14.722 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.722 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:27:14.722 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:14.722 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:14.980 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:14.980 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:14.980 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:14.980 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:27:14.980 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:15.239 "name": "pt4", 00:27:15.239 "aliases": [ 00:27:15.239 "00000000-0000-0000-0000-000000000004" 00:27:15.239 ], 00:27:15.239 "product_name": "passthru", 00:27:15.239 "block_size": 512, 00:27:15.239 "num_blocks": 65536, 00:27:15.239 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:15.239 "assigned_rate_limits": { 00:27:15.239 "rw_ios_per_sec": 0, 00:27:15.239 "rw_mbytes_per_sec": 0, 00:27:15.239 "r_mbytes_per_sec": 0, 00:27:15.239 "w_mbytes_per_sec": 0 00:27:15.239 }, 00:27:15.239 "claimed": true, 00:27:15.239 "claim_type": "exclusive_write", 00:27:15.239 "zoned": false, 00:27:15.239 "supported_io_types": { 00:27:15.239 "read": true, 00:27:15.239 "write": true, 00:27:15.239 "unmap": true, 00:27:15.239 "flush": true, 00:27:15.239 "reset": true, 00:27:15.239 "nvme_admin": false, 00:27:15.239 "nvme_io": false, 00:27:15.239 "nvme_io_md": false, 00:27:15.239 "write_zeroes": true, 00:27:15.239 "zcopy": true, 00:27:15.239 "get_zone_info": false, 00:27:15.239 "zone_management": false, 00:27:15.239 "zone_append": false, 00:27:15.239 "compare": false, 00:27:15.239 
"compare_and_write": false, 00:27:15.239 "abort": true, 00:27:15.239 "seek_hole": false, 00:27:15.239 "seek_data": false, 00:27:15.239 "copy": true, 00:27:15.239 "nvme_iov_md": false 00:27:15.239 }, 00:27:15.239 "memory_domains": [ 00:27:15.239 { 00:27:15.239 "dma_device_id": "system", 00:27:15.239 "dma_device_type": 1 00:27:15.239 }, 00:27:15.239 { 00:27:15.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.239 "dma_device_type": 2 00:27:15.239 } 00:27:15.239 ], 00:27:15.239 "driver_specific": { 00:27:15.239 "passthru": { 00:27:15.239 "name": "pt4", 00:27:15.239 "base_bdev_name": "malloc4" 00:27:15.239 } 00:27:15.239 } 00:27:15.239 }' 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:15.239 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:15.498 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:15.757 [2024-07-23 17:21:10.975427] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:15.757 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 6c721821-2f36-4b7a-ab5c-c9e13ee1db44 '!=' 6c721821-2f36-4b7a-ab5c-c9e13ee1db44 ']' 00:27:15.757 17:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:15.757 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:15.757 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:27:15.757 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:16.325 [2024-07-23 17:21:11.484513] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.325 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.585 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.585 "name": "raid_bdev1", 00:27:16.585 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:16.585 "strip_size_kb": 0, 00:27:16.585 "state": "online", 00:27:16.585 "raid_level": "raid1", 00:27:16.585 "superblock": true, 00:27:16.585 "num_base_bdevs": 4, 00:27:16.585 "num_base_bdevs_discovered": 3, 00:27:16.585 "num_base_bdevs_operational": 3, 00:27:16.585 "base_bdevs_list": [ 00:27:16.585 { 00:27:16.585 "name": null, 00:27:16.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.585 "is_configured": false, 00:27:16.585 "data_offset": 2048, 00:27:16.585 "data_size": 63488 00:27:16.585 }, 00:27:16.585 { 00:27:16.585 "name": "pt2", 00:27:16.585 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:16.585 "is_configured": true, 00:27:16.585 "data_offset": 2048, 00:27:16.585 "data_size": 63488 00:27:16.585 }, 00:27:16.585 { 00:27:16.585 "name": "pt3", 00:27:16.585 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:16.585 "is_configured": true, 00:27:16.585 "data_offset": 2048, 00:27:16.585 "data_size": 63488 00:27:16.585 }, 00:27:16.585 { 00:27:16.585 "name": "pt4", 00:27:16.585 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:16.585 "is_configured": true, 00:27:16.585 "data_offset": 2048, 00:27:16.585 "data_size": 63488 00:27:16.585 } 00:27:16.585 ] 00:27:16.585 }' 00:27:16.585 17:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.585 
17:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:17.154 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:17.154 [2024-07-23 17:21:12.535268] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:17.154 [2024-07-23 17:21:12.535296] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:17.154 [2024-07-23 17:21:12.535350] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:17.154 [2024-07-23 17:21:12.535417] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:17.154 [2024-07-23 17:21:12.535428] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b6da90 name raid_bdev1, state offline 00:27:17.154 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.154 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:17.413 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:17.413 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:17.413 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:17.413 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:17.413 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:17.687 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:17.687 17:21:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:17.687 17:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:17.973 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:18.232 [2024-07-23 17:21:13.525844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:18.232 [2024-07-23 17:21:13.525885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:18.232 [2024-07-23 17:21:13.525909] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bced0 00:27:18.232 [2024-07-23 17:21:13.525921] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:18.232 [2024-07-23 17:21:13.527477] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:18.232 [2024-07-23 17:21:13.527503] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:27:18.232 [2024-07-23 17:21:13.527567] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:18.232 [2024-07-23 17:21:13.527591] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:18.232 pt2 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.232 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.491 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.491 "name": "raid_bdev1", 00:27:18.491 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:18.491 "strip_size_kb": 0, 00:27:18.491 "state": "configuring", 
00:27:18.491 "raid_level": "raid1", 00:27:18.491 "superblock": true, 00:27:18.491 "num_base_bdevs": 4, 00:27:18.491 "num_base_bdevs_discovered": 1, 00:27:18.491 "num_base_bdevs_operational": 3, 00:27:18.491 "base_bdevs_list": [ 00:27:18.491 { 00:27:18.491 "name": null, 00:27:18.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.491 "is_configured": false, 00:27:18.491 "data_offset": 2048, 00:27:18.491 "data_size": 63488 00:27:18.491 }, 00:27:18.491 { 00:27:18.491 "name": "pt2", 00:27:18.491 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:18.491 "is_configured": true, 00:27:18.491 "data_offset": 2048, 00:27:18.491 "data_size": 63488 00:27:18.491 }, 00:27:18.491 { 00:27:18.491 "name": null, 00:27:18.491 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:18.491 "is_configured": false, 00:27:18.491 "data_offset": 2048, 00:27:18.491 "data_size": 63488 00:27:18.491 }, 00:27:18.491 { 00:27:18.491 "name": null, 00:27:18.491 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:18.491 "is_configured": false, 00:27:18.491 "data_offset": 2048, 00:27:18.491 "data_size": 63488 00:27:18.491 } 00:27:18.491 ] 00:27:18.491 }' 00:27:18.491 17:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.491 17:21:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:19.059 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:27:19.059 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:19.059 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:27:19.318 [2024-07-23 17:21:14.560643] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:27:19.318 [2024-07-23 17:21:14.560686] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.318 [2024-07-23 17:21:14.560704] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6f1a0 00:27:19.318 [2024-07-23 17:21:14.560716] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.318 [2024-07-23 17:21:14.561044] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.318 [2024-07-23 17:21:14.561061] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:27:19.318 [2024-07-23 17:21:14.561120] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:27:19.318 [2024-07-23 17:21:14.561138] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:27:19.318 pt3 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.318 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.576 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:19.576 "name": "raid_bdev1", 00:27:19.576 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:19.576 "strip_size_kb": 0, 00:27:19.576 "state": "configuring", 00:27:19.576 "raid_level": "raid1", 00:27:19.576 "superblock": true, 00:27:19.576 "num_base_bdevs": 4, 00:27:19.576 "num_base_bdevs_discovered": 2, 00:27:19.576 "num_base_bdevs_operational": 3, 00:27:19.576 "base_bdevs_list": [ 00:27:19.576 { 00:27:19.576 "name": null, 00:27:19.576 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.576 "is_configured": false, 00:27:19.576 "data_offset": 2048, 00:27:19.576 "data_size": 63488 00:27:19.576 }, 00:27:19.576 { 00:27:19.576 "name": "pt2", 00:27:19.576 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:19.576 "is_configured": true, 00:27:19.576 "data_offset": 2048, 00:27:19.576 "data_size": 63488 00:27:19.576 }, 00:27:19.576 { 00:27:19.576 "name": "pt3", 00:27:19.576 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:19.576 "is_configured": true, 00:27:19.576 "data_offset": 2048, 00:27:19.576 "data_size": 63488 00:27:19.576 }, 00:27:19.576 { 00:27:19.576 "name": null, 00:27:19.576 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:19.576 "is_configured": false, 00:27:19.576 "data_offset": 2048, 00:27:19.576 "data_size": 63488 00:27:19.576 } 00:27:19.576 ] 00:27:19.576 }' 00:27:19.576 17:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:19.576 17:21:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:27:20.143 [2024-07-23 17:21:15.507158] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:27:20.143 [2024-07-23 17:21:15.507204] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.143 [2024-07-23 17:21:15.507221] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b694b0 00:27:20.143 [2024-07-23 17:21:15.507234] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.143 [2024-07-23 17:21:15.507555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.143 [2024-07-23 17:21:15.507572] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:27:20.143 [2024-07-23 17:21:15.507629] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:27:20.143 [2024-07-23 17:21:15.507647] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:27:20.143 [2024-07-23 17:21:15.507754] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b6a940 00:27:20.143 [2024-07-23 17:21:15.507764] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:20.143 [2024-07-23 17:21:15.507939] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b66670 00:27:20.143 [2024-07-23 17:21:15.508069] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b6a940 00:27:20.143 [2024-07-23 17:21:15.508078] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x1b6a940 00:27:20.143 [2024-07-23 17:21:15.508171] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:20.143 pt4 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.143 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.401 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.401 "name": "raid_bdev1", 00:27:20.401 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:20.401 "strip_size_kb": 0, 00:27:20.401 "state": "online", 00:27:20.401 "raid_level": "raid1", 00:27:20.401 "superblock": true, 00:27:20.401 "num_base_bdevs": 4, 00:27:20.401 "num_base_bdevs_discovered": 3, 00:27:20.401 
"num_base_bdevs_operational": 3, 00:27:20.401 "base_bdevs_list": [ 00:27:20.401 { 00:27:20.401 "name": null, 00:27:20.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.401 "is_configured": false, 00:27:20.401 "data_offset": 2048, 00:27:20.401 "data_size": 63488 00:27:20.401 }, 00:27:20.401 { 00:27:20.401 "name": "pt2", 00:27:20.401 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:20.401 "is_configured": true, 00:27:20.401 "data_offset": 2048, 00:27:20.401 "data_size": 63488 00:27:20.401 }, 00:27:20.401 { 00:27:20.401 "name": "pt3", 00:27:20.401 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:20.401 "is_configured": true, 00:27:20.401 "data_offset": 2048, 00:27:20.401 "data_size": 63488 00:27:20.401 }, 00:27:20.401 { 00:27:20.401 "name": "pt4", 00:27:20.401 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:20.401 "is_configured": true, 00:27:20.401 "data_offset": 2048, 00:27:20.401 "data_size": 63488 00:27:20.401 } 00:27:20.401 ] 00:27:20.401 }' 00:27:20.401 17:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.401 17:21:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:21.337 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:21.337 [2024-07-23 17:21:16.670242] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:21.337 [2024-07-23 17:21:16.670265] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:21.337 [2024-07-23 17:21:16.670315] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:21.337 [2024-07-23 17:21:16.670382] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:21.337 [2024-07-23 17:21:16.670394] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x1b6a940 name raid_bdev1, state offline 00:27:21.337 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:21.337 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.596 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:21.596 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:21.596 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:27:21.596 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:27:21.596 17:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:27:21.856 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:22.115 [2024-07-23 17:21:17.396128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:22.115 [2024-07-23 17:21:17.396169] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:22.115 [2024-07-23 17:21:17.396186] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19baf40 00:27:22.115 [2024-07-23 17:21:17.396198] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:22.115 [2024-07-23 17:21:17.397811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:22.115 [2024-07-23 17:21:17.397839] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:22.115 [2024-07-23 17:21:17.397915] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:27:22.115 [2024-07-23 17:21:17.397941] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:22.115 [2024-07-23 17:21:17.398040] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:22.115 [2024-07-23 17:21:17.398053] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:22.115 [2024-07-23 17:21:17.398067] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b6be30 name raid_bdev1, state configuring 00:27:22.115 [2024-07-23 17:21:17.398097] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:22.115 [2024-07-23 17:21:17.398172] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:27:22.115 pt1 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:22.115 17:21:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.115 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:22.375 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:22.375 "name": "raid_bdev1", 00:27:22.375 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:22.375 "strip_size_kb": 0, 00:27:22.375 "state": "configuring", 00:27:22.375 "raid_level": "raid1", 00:27:22.375 "superblock": true, 00:27:22.375 "num_base_bdevs": 4, 00:27:22.375 "num_base_bdevs_discovered": 2, 00:27:22.375 "num_base_bdevs_operational": 3, 00:27:22.375 "base_bdevs_list": [ 00:27:22.375 { 00:27:22.375 "name": null, 00:27:22.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:22.375 "is_configured": false, 00:27:22.375 "data_offset": 2048, 00:27:22.375 "data_size": 63488 00:27:22.375 }, 00:27:22.375 { 00:27:22.375 "name": "pt2", 00:27:22.375 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:22.375 "is_configured": true, 00:27:22.375 "data_offset": 2048, 00:27:22.375 "data_size": 63488 00:27:22.375 }, 00:27:22.375 { 00:27:22.375 "name": "pt3", 00:27:22.375 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:22.375 "is_configured": true, 00:27:22.375 "data_offset": 2048, 00:27:22.375 "data_size": 63488 00:27:22.375 }, 00:27:22.375 { 00:27:22.375 "name": null, 00:27:22.375 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:22.375 "is_configured": false, 00:27:22.375 "data_offset": 2048, 00:27:22.375 "data_size": 63488 00:27:22.375 } 00:27:22.375 ] 00:27:22.375 }' 00:27:22.375 17:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:22.375 17:21:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:27:22.944 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:27:22.944 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:23.203 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:27:23.203 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:27:23.462 [2024-07-23 17:21:18.811906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:27:23.462 [2024-07-23 17:21:18.811953] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:23.462 [2024-07-23 17:21:18.811972] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b67a20 00:27:23.462 [2024-07-23 17:21:18.811984] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:23.462 [2024-07-23 17:21:18.812306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:23.462 [2024-07-23 17:21:18.812324] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:27:23.462 [2024-07-23 17:21:18.812391] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:27:23.463 [2024-07-23 17:21:18.812410] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:27:23.463 [2024-07-23 17:21:18.812519] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b6c0b0 00:27:23.463 [2024-07-23 17:21:18.812529] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:23.463 [2024-07-23 17:21:18.812705] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x19d35e0 00:27:23.463 [2024-07-23 17:21:18.812833] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b6c0b0 00:27:23.463 [2024-07-23 17:21:18.812842] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b6c0b0 00:27:23.463 [2024-07-23 17:21:18.812950] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:23.463 pt4 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.463 17:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.722 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.722 "name": "raid_bdev1", 
00:27:23.722 "uuid": "6c721821-2f36-4b7a-ab5c-c9e13ee1db44", 00:27:23.722 "strip_size_kb": 0, 00:27:23.722 "state": "online", 00:27:23.722 "raid_level": "raid1", 00:27:23.722 "superblock": true, 00:27:23.722 "num_base_bdevs": 4, 00:27:23.722 "num_base_bdevs_discovered": 3, 00:27:23.722 "num_base_bdevs_operational": 3, 00:27:23.722 "base_bdevs_list": [ 00:27:23.722 { 00:27:23.722 "name": null, 00:27:23.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.723 "is_configured": false, 00:27:23.723 "data_offset": 2048, 00:27:23.723 "data_size": 63488 00:27:23.723 }, 00:27:23.723 { 00:27:23.723 "name": "pt2", 00:27:23.723 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:23.723 "is_configured": true, 00:27:23.723 "data_offset": 2048, 00:27:23.723 "data_size": 63488 00:27:23.723 }, 00:27:23.723 { 00:27:23.723 "name": "pt3", 00:27:23.723 "uuid": "00000000-0000-0000-0000-000000000003", 00:27:23.723 "is_configured": true, 00:27:23.723 "data_offset": 2048, 00:27:23.723 "data_size": 63488 00:27:23.723 }, 00:27:23.723 { 00:27:23.723 "name": "pt4", 00:27:23.723 "uuid": "00000000-0000-0000-0000-000000000004", 00:27:23.723 "is_configured": true, 00:27:23.723 "data_offset": 2048, 00:27:23.723 "data_size": 63488 00:27:23.723 } 00:27:23.723 ] 00:27:23.723 }' 00:27:23.723 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.723 17:21:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:24.661 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:24.661 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:24.661 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:24.661 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:24.661 17:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:24.919 [2024-07-23 17:21:20.272074] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 6c721821-2f36-4b7a-ab5c-c9e13ee1db44 '!=' 6c721821-2f36-4b7a-ab5c-c9e13ee1db44 ']' 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 20054 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 20054 ']' 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 20054 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:24.919 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 20054 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 20054' 00:27:25.177 killing process with pid 20054 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 20054 00:27:25.177 [2024-07-23 17:21:20.346370] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:25.177 [2024-07-23 17:21:20.346431] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:25.177 [2024-07-23 17:21:20.346504] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:27:25.177 [2024-07-23 17:21:20.346517] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b6c0b0 name raid_bdev1, state offline 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 20054 00:27:25.177 [2024-07-23 17:21:20.389940] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:27:25.177 00:27:25.177 real 0m25.916s 00:27:25.177 user 0m47.392s 00:27:25.177 sys 0m4.727s 00:27:25.177 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:25.178 17:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:27:25.178 ************************************ 00:27:25.178 END TEST raid_superblock_test 00:27:25.178 ************************************ 00:27:25.435 17:21:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:25.435 17:21:20 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:27:25.436 17:21:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:25.436 17:21:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:25.436 17:21:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:25.436 ************************************ 00:27:25.436 START TEST raid_read_error_test 00:27:25.436 ************************************ 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:27:25.436 17:21:20 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.QkwudhhHg1 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=24427 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 24427 /var/tmp/spdk-raid.sock 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 24427 ']' 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:25.436 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:25.436 17:21:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:27:25.436 [2024-07-23 17:21:20.762231] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:27:25.436 [2024-07-23 17:21:20.762287] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid24427 ] 00:27:25.694 [2024-07-23 17:21:20.879515] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:25.694 [2024-07-23 17:21:20.930174] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:25.694 [2024-07-23 17:21:20.987623] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:25.695 [2024-07-23 17:21:20.987654] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:25.695 17:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:25.695 17:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:27:25.695 17:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:25.695 17:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:25.953 BaseBdev1_malloc 00:27:25.953 17:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:27:26.210 true 00:27:26.210 17:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:27:26.468 [2024-07-23 17:21:21.770964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:27:26.468 [2024-07-23 17:21:21.771012] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:26.468 [2024-07-23 17:21:21.771030] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f825c0 00:27:26.468 [2024-07-23 17:21:21.771043] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:26.468 [2024-07-23 17:21:21.772516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:26.468 [2024-07-23 17:21:21.772543] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:26.468 BaseBdev1 00:27:26.468 17:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:26.468 17:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:26.726 BaseBdev2_malloc 00:27:26.726 17:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:27:26.983 true 00:27:26.983 17:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:27:27.241 [2024-07-23 17:21:22.517595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:27:27.241 [2024-07-23 17:21:22.517640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.241 [2024-07-23 17:21:22.517660] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1f7c620 00:27:27.241 [2024-07-23 17:21:22.517673] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.241 [2024-07-23 17:21:22.519044] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.241 [2024-07-23 17:21:22.519070] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:27.241 BaseBdev2 00:27:27.241 17:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:27.241 17:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:27.499 BaseBdev3_malloc 00:27:27.499 17:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:27:27.757 true 00:27:27.757 17:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:27:28.014 [2024-07-23 17:21:23.255956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:27:28.014 [2024-07-23 17:21:23.255997] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:28.014 [2024-07-23 17:21:23.256018] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f7cc00 00:27:28.014 [2024-07-23 17:21:23.256030] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:28.014 [2024-07-23 17:21:23.257400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:28.014 [2024-07-23 17:21:23.257426] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:28.014 
BaseBdev3 00:27:28.014 17:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:27:28.014 17:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:28.271 BaseBdev4_malloc 00:27:28.271 17:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:27:28.528 true 00:27:28.528 17:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:27:28.785 [2024-07-23 17:21:24.006555] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:27:28.785 [2024-07-23 17:21:24.006595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:28.785 [2024-07-23 17:21:24.006615] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f7f9c0 00:27:28.785 [2024-07-23 17:21:24.006627] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:28.785 [2024-07-23 17:21:24.008027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:28.785 [2024-07-23 17:21:24.008056] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:28.785 BaseBdev4 00:27:28.785 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:27:29.042 [2024-07-23 17:21:24.251250] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:29.042 [2024-07-23 
17:21:24.252501] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:29.042 [2024-07-23 17:21:24.252565] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:29.042 [2024-07-23 17:21:24.252625] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:29.042 [2024-07-23 17:21:24.252850] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e6a510 00:27:29.042 [2024-07-23 17:21:24.252862] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:29.042 [2024-07-23 17:21:24.253067] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dce980 00:27:29.042 [2024-07-23 17:21:24.253223] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e6a510 00:27:29.042 [2024-07-23 17:21:24.253233] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e6a510 00:27:29.042 [2024-07-23 17:21:24.253334] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.042 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.301 17:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.301 "name": "raid_bdev1", 00:27:29.301 "uuid": "2365d083-101a-4388-88f4-9bece3e71598", 00:27:29.301 "strip_size_kb": 0, 00:27:29.301 "state": "online", 00:27:29.301 "raid_level": "raid1", 00:27:29.301 "superblock": true, 00:27:29.301 "num_base_bdevs": 4, 00:27:29.301 "num_base_bdevs_discovered": 4, 00:27:29.301 "num_base_bdevs_operational": 4, 00:27:29.301 "base_bdevs_list": [ 00:27:29.301 { 00:27:29.301 "name": "BaseBdev1", 00:27:29.301 "uuid": "6da84b6d-0582-59fd-88f1-1347bc6ba843", 00:27:29.301 "is_configured": true, 00:27:29.301 "data_offset": 2048, 00:27:29.301 "data_size": 63488 00:27:29.301 }, 00:27:29.301 { 00:27:29.301 "name": "BaseBdev2", 00:27:29.301 "uuid": "99c4a157-90eb-59cc-a1d9-cd122817eced", 00:27:29.301 "is_configured": true, 00:27:29.301 "data_offset": 2048, 00:27:29.301 "data_size": 63488 00:27:29.301 }, 00:27:29.301 { 00:27:29.301 "name": "BaseBdev3", 00:27:29.301 "uuid": "d53a393f-3ffd-5d47-a518-e35ee7e5be1c", 00:27:29.301 "is_configured": true, 00:27:29.301 "data_offset": 2048, 00:27:29.301 "data_size": 63488 00:27:29.301 }, 00:27:29.301 { 00:27:29.301 "name": "BaseBdev4", 00:27:29.301 "uuid": "10a17f2c-4f19-53ce-8d90-d67d371da2ac", 00:27:29.301 "is_configured": true, 00:27:29.301 "data_offset": 2048, 00:27:29.301 "data_size": 63488 00:27:29.301 } 00:27:29.301 ] 00:27:29.301 }' 00:27:29.301 17:21:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:29.301 17:21:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:29.867 17:21:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:27:29.867 17:21:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:27:29.867 [2024-07-23 17:21:25.226087] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e86600
00:27:30.802 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]]
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]]
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:31.060 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:31.318 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:31.318 "name": "raid_bdev1",
00:27:31.318 "uuid": "2365d083-101a-4388-88f4-9bece3e71598",
00:27:31.318 "strip_size_kb": 0,
00:27:31.318 "state": "online",
00:27:31.318 "raid_level": "raid1",
00:27:31.318 "superblock": true,
00:27:31.318 "num_base_bdevs": 4,
00:27:31.318 "num_base_bdevs_discovered": 4,
00:27:31.318 "num_base_bdevs_operational": 4,
00:27:31.318 "base_bdevs_list": [
00:27:31.318 {
00:27:31.318 "name": "BaseBdev1",
00:27:31.318 "uuid": "6da84b6d-0582-59fd-88f1-1347bc6ba843",
00:27:31.318 "is_configured": true,
00:27:31.318 "data_offset": 2048,
00:27:31.318 "data_size": 63488
00:27:31.318 },
00:27:31.318 {
00:27:31.318 "name": "BaseBdev2",
00:27:31.318 "uuid": "99c4a157-90eb-59cc-a1d9-cd122817eced",
00:27:31.318 "is_configured": true,
00:27:31.318 "data_offset": 2048,
00:27:31.318 "data_size": 63488
00:27:31.318 },
00:27:31.318 {
00:27:31.318 "name": "BaseBdev3",
00:27:31.318 "uuid": "d53a393f-3ffd-5d47-a518-e35ee7e5be1c",
00:27:31.318 "is_configured": true,
00:27:31.318 "data_offset": 2048,
00:27:31.318 "data_size": 63488
00:27:31.319 },
00:27:31.319 {
00:27:31.319 "name": "BaseBdev4",
00:27:31.319 "uuid": "10a17f2c-4f19-53ce-8d90-d67d371da2ac",
00:27:31.319 "is_configured": true,
00:27:31.319 "data_offset": 2048,
00:27:31.319 "data_size": 63488
00:27:31.319 }
00:27:31.319 ]
00:27:31.319 }'
00:27:31.319 17:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:31.319 17:21:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:31.884 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:27:32.145 [2024-07-23 17:21:27.444881] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:27:32.145 [2024-07-23 17:21:27.444934] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:27:32.145 [2024-07-23 17:21:27.448176] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:27:32.145 [2024-07-23 17:21:27.448231] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:32.145 [2024-07-23 17:21:27.448350] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:27:32.145 [2024-07-23 17:21:27.448362] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e6a510 name raid_bdev1, state offline
00:27:32.145 0
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 24427
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 24427 ']'
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 24427
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 24427
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 24427'
killing process with pid 24427
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 24427
00:27:32.145 [2024-07-23 17:21:27.530702] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:27:32.145 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 24427
00:27:32.145 [2024-07-23 17:21:27.563323] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:27:32.441 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.QkwudhhHg1
00:27:32.441 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]]
00:27:32.442
00:27:32.442 real	0m7.106s
00:27:32.442 user	0m11.585s
00:27:32.442 sys	0m1.360s
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:27:32.442 17:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:32.442 ************************************
00:27:32.442 END TEST raid_read_error_test
00:27:32.442 ************************************
00:27:32.442 17:21:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:27:32.442 17:21:27 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write
00:27:32.442 17:21:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:27:32.442 17:21:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:32.442 17:21:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:27:32.701 ************************************
00:27:32.701 START TEST raid_write_error_test
00:27:32.701 ************************************
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']'
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.aPUOxL5EdJ
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=25408
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 25408 /var/tmp/spdk-raid.sock
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 25408 ']'
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:27:32.701 17:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:32.701 [2024-07-23 17:21:27.966322] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:27:32.701 [2024-07-23 17:21:27.966400] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid25408 ]
00:27:32.701 [2024-07-23 17:21:28.103295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:27:32.960 [2024-07-23 17:21:28.158705] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:27:32.960 [2024-07-23 17:21:28.218939] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:27:32.960 [2024-07-23 17:21:28.218968] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:27:33.528 17:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:27:33.528 17:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:27:33.528 17:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:27:33.528 17:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:27:33.786 BaseBdev1_malloc
00:27:33.786 17:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:27:34.045 true
00:27:34.045 17:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:27:34.303 [2024-07-23 17:21:29.561717] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:27:34.303 [2024-07-23 17:21:29.561762] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:34.303 [2024-07-23 17:21:29.561781] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e95c0
00:27:34.303 [2024-07-23 17:21:29.561794] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:34.303 [2024-07-23 17:21:29.563268] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:34.303 [2024-07-23 17:21:29.563295] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:27:34.303 BaseBdev1
00:27:34.303 17:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:27:34.303 17:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:27:34.562 BaseBdev2_malloc
00:27:34.562 17:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:27:34.820 true
00:27:34.820 17:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:27:35.079 [2024-07-23 17:21:30.276241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:27:35.079 [2024-07-23 17:21:30.276294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:35.079 [2024-07-23 17:21:30.276318] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e3620
00:27:35.079 [2024-07-23 17:21:30.276331] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:35.079 [2024-07-23 17:21:30.277986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:35.079 [2024-07-23 17:21:30.278016] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:27:35.079 BaseBdev2
00:27:35.079 17:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:27:35.079 17:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:27:35.337 BaseBdev3_malloc
00:27:35.337 17:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:27:35.596 true
00:27:35.596 17:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:27:35.855 [2024-07-23 17:21:31.034881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:27:35.855 [2024-07-23 17:21:31.034936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:35.855 [2024-07-23 17:21:31.034960] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e3c00
00:27:35.855 [2024-07-23 17:21:31.034973] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:35.855 [2024-07-23 17:21:31.036447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:35.855 [2024-07-23 17:21:31.036476] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:27:35.855 BaseBdev3
00:27:35.855 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:27:35.855 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:27:36.114 BaseBdev4_malloc
00:27:36.114 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:27:36.114 true
00:27:36.114 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:27:36.373 [2024-07-23 17:21:31.717291] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:27:36.373 [2024-07-23 17:21:31.717335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:36.373 [2024-07-23 17:21:31.717354] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e69c0
00:27:36.373 [2024-07-23 17:21:31.717366] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:36.373 [2024-07-23 17:21:31.718748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:36.373 [2024-07-23 17:21:31.718776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:27:36.373 BaseBdev4
00:27:36.373 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
00:27:36.632 [2024-07-23 17:21:31.965990] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:27:36.632 [2024-07-23 17:21:31.967146] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:27:36.632 [2024-07-23 17:21:31.967207] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:27:36.632 [2024-07-23 17:21:31.967266] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:27:36.632 [2024-07-23 17:21:31.967487] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15d1510
00:27:36.632 [2024-07-23 17:21:31.967498] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:27:36.632 [2024-07-23 17:21:31.967678] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1535980
00:27:36.632 [2024-07-23 17:21:31.967828] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15d1510
00:27:36.632 [2024-07-23 17:21:31.967839] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15d1510
00:27:36.632 [2024-07-23 17:21:31.967943] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:36.632 17:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:36.890 17:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:36.890 "name": "raid_bdev1",
00:27:36.890 "uuid": "d8a7aed2-e303-4b53-a30a-b793e2bf4d10",
00:27:36.890 "strip_size_kb": 0,
00:27:36.890 "state": "online",
00:27:36.890 "raid_level": "raid1",
00:27:36.890 "superblock": true,
00:27:36.890 "num_base_bdevs": 4,
00:27:36.890 "num_base_bdevs_discovered": 4,
00:27:36.890 "num_base_bdevs_operational": 4,
00:27:36.890 "base_bdevs_list": [
00:27:36.890 {
00:27:36.890 "name": "BaseBdev1",
00:27:36.890 "uuid": "84b35055-1647-5597-8890-6e0b6c1c9d59",
00:27:36.890 "is_configured": true,
00:27:36.890 "data_offset": 2048,
00:27:36.890 "data_size": 63488
00:27:36.890 },
00:27:36.890 {
00:27:36.890 "name": "BaseBdev2",
00:27:36.890 "uuid": "ce0dada5-d3ea-5832-924b-b7af9c978592",
00:27:36.890 "is_configured": true,
00:27:36.890 "data_offset": 2048,
00:27:36.890 "data_size": 63488
00:27:36.890 },
00:27:36.890 {
00:27:36.890 "name": "BaseBdev3",
00:27:36.890 "uuid": "b1d63c54-5499-5d69-abd7-d23451f3bc44",
00:27:36.890 "is_configured": true,
00:27:36.890 "data_offset": 2048,
00:27:36.890 "data_size": 63488
00:27:36.890 },
00:27:36.890 {
00:27:36.890 "name": "BaseBdev4",
00:27:36.890 "uuid": "12325307-6d39-5bff-b8e5-ec9fcf9f2b41",
00:27:36.890 "is_configured": true,
00:27:36.890 "data_offset": 2048,
00:27:36.890 "data_size": 63488
00:27:36.890 }
00:27:36.890 ]
00:27:36.890 }'
00:27:36.890 17:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:36.890 17:21:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:37.458 17:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:27:37.458 17:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:27:37.716 [2024-07-23 17:21:32.884687] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ed600
00:27:38.653 17:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:27:38.653 [2024-07-23 17:21:34.046826] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1'
00:27:38.654 [2024-07-23 17:21:34.046886] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:27:38.654 [2024-07-23 17:21:34.047116] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15ed600
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]]
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]]
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:38.654 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:38.913 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:38.913 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:38.913 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:38.913 "name": "raid_bdev1",
00:27:38.913 "uuid": "d8a7aed2-e303-4b53-a30a-b793e2bf4d10",
00:27:38.913 "strip_size_kb": 0,
00:27:38.913 "state": "online",
00:27:38.913 "raid_level": "raid1",
00:27:38.913 "superblock": true,
00:27:38.913 "num_base_bdevs": 4,
00:27:38.913 "num_base_bdevs_discovered": 3,
00:27:38.913 "num_base_bdevs_operational": 3,
00:27:38.913 "base_bdevs_list": [
00:27:38.913 {
00:27:38.913 "name": null,
00:27:38.913 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:38.913 "is_configured": false,
00:27:38.913 "data_offset": 2048,
00:27:38.913 "data_size": 63488
00:27:38.913 },
00:27:38.913 {
00:27:38.913 "name": "BaseBdev2",
00:27:38.913 "uuid": "ce0dada5-d3ea-5832-924b-b7af9c978592",
00:27:38.913 "is_configured": true,
00:27:38.913 "data_offset": 2048,
00:27:38.913 "data_size": 63488
00:27:38.913 },
00:27:38.913 {
00:27:38.913 "name": "BaseBdev3",
00:27:38.913 "uuid": "b1d63c54-5499-5d69-abd7-d23451f3bc44",
00:27:38.913 "is_configured": true,
00:27:38.913 "data_offset": 2048,
00:27:38.913 "data_size": 63488
00:27:38.913 },
00:27:38.913 {
00:27:38.913 "name": "BaseBdev4",
00:27:38.913 "uuid": "12325307-6d39-5bff-b8e5-ec9fcf9f2b41",
00:27:38.913 "is_configured": true,
00:27:38.913 "data_offset": 2048,
00:27:38.913 "data_size": 63488
00:27:38.913 }
00:27:38.913 ]
00:27:38.913 }'
00:27:38.913 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:38.913 17:21:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:39.481 17:21:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:27:39.739 [2024-07-23 17:21:35.106451] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:27:39.739 [2024-07-23 17:21:35.106493] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:27:39.739 [2024-07-23 17:21:35.109652] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:27:39.739 [2024-07-23 17:21:35.109691] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:39.739 [2024-07-23 17:21:35.109791] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:27:39.739 [2024-07-23 17:21:35.109803] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15d1510 name raid_bdev1, state offline
00:27:39.739 0
00:27:39.739 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 25408
00:27:39.739 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 25408 ']'
00:27:39.739 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 25408
00:27:39.739 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:27:39.739 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:27:39.740 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 25408
00:27:39.998 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:27:39.998 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:27:39.998 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 25408'
killing process with pid 25408
00:27:39.998 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 25408
00:27:39.998 [2024-07-23 17:21:35.180128] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:27:39.998 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 25408
00:27:40.257 [2024-07-23 17:21:35.215539] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.aPUOxL5EdJ
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]]
00:27:40.257
00:27:40.257 real	0m7.563s
00:27:40.257 user	0m12.003s
00:27:40.257 sys	0m1.407s
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:27:40.257 17:21:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:27:40.257 ************************************
00:27:40.257 END TEST raid_write_error_test
00:27:40.257 ************************************
00:27:40.257 17:21:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:27:40.257 17:21:35 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']'
00:27:40.257 17:21:35 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4
00:27:40.257 17:21:35 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true
00:27:40.257 17:21:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:27:40.257 17:21:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:27:40.257 17:21:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:27:40.258 ************************************
00:27:40.258 START TEST raid_rebuild_test
00:27:40.258 ************************************
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 ))
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']'
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']'
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=26555
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 26555 /var/tmp/spdk-raid.sock
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 26555 ']'
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:27:40.258 17:21:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:27:40.258 [2024-07-23 17:21:35.607516] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:27:40.258 [2024-07-23 17:21:35.607582] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid26555 ]
00:27:40.258 I/O size of 3145728 is greater than zero copy threshold (65536).
00:27:40.258 Zero copy mechanism will not be used.
00:27:40.516 [2024-07-23 17:21:35.739358] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:27:40.516 [2024-07-23 17:21:35.795238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:27:40.516 [2024-07-23 17:21:35.861277] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:27:40.516 [2024-07-23 17:21:35.861316] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:27:41.450 17:21:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:27:41.450 17:21:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0
00:27:41.450 17:21:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:27:41.450 17:21:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:27:41.450 BaseBdev1_malloc
00:27:41.450 17:21:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
00:27:41.709 [2024-07-23 17:21:37.024922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc
00:27:41.709 [2024-07-23 17:21:37.024973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:41.709 [2024-07-23 17:21:37.024999] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125a170
00:27:41.709 [2024-07-23 17:21:37.025011] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:41.709 [2024-07-23 17:21:37.026660] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:41.709 [2024-07-23 17:21:37.026689] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:27:41.709 BaseBdev1
00:27:41.709 17:21:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:27:41.709 17:21:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:27:41.968 BaseBdev2_malloc
00:27:41.968 17:21:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:27:42.227 [2024-07-23 17:21:37.524206] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:27:42.227 [2024-07-23 17:21:37.524256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:42.227 [2024-07-23 17:21:37.524276] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1140680
00:27:42.227 [2024-07-23 17:21:37.524294] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:42.227 [2024-07-23 17:21:37.525870] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:42.227 [2024-07-23 17:21:37.525907] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:27:42.227 BaseBdev2
00:27:42.227 17:21:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
00:27:42.485 spare_malloc
00:27:42.485 17:21:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:27:42.744 spare_delay
00:27:42.744 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:27:43.002 [2024-07-23 17:21:38.267967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:27:43.002 [2024-07-23 17:21:38.268014] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:43.002 [2024-07-23 17:21:38.268035] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11442a0
00:27:43.002 [2024-07-23 17:21:38.268047] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:43.002 [2024-07-23 17:21:38.269592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:43.002 [2024-07-23 17:21:38.269621] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:27:43.002 spare
00:27:43.002 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
00:27:43.260 [2024-07-23 17:21:38.508618] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:27:43.260 [2024-07-23 17:21:38.509913] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:27:43.260 [2024-07-23 17:21:38.509994] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1143ea0
00:27:43.260 [2024-07-23 17:21:38.510006] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512
00:27:43.260 [2024-07-23 17:21:38.510213] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11472d0
00:27:43.260 [2024-07-23 17:21:38.510356] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1143ea0
00:27:43.260 [2024-07-23 17:21:38.510367] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1143ea0
00:27:43.260 [2024-07-23 17:21:38.510481] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:43.261 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:43.519 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:43.519 "name": "raid_bdev1",
00:27:43.519 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac",
00:27:43.519 "strip_size_kb": 0,
00:27:43.519 "state": "online",
00:27:43.519 "raid_level": "raid1",
00:27:43.519 "superblock": false,
00:27:43.519 "num_base_bdevs": 2,
00:27:43.519 "num_base_bdevs_discovered": 2,
00:27:43.519 "num_base_bdevs_operational": 2,
00:27:43.519 "base_bdevs_list": [
00:27:43.519 {
00:27:43.519 "name": "BaseBdev1",
00:27:43.519 "uuid": "4c431277-61cc-5cf0-8bdb-4b98217e8a36",
00:27:43.519 "is_configured": true,
00:27:43.519 "data_offset": 0,
00:27:43.519 "data_size": 65536
00:27:43.519 },
00:27:43.519 {
00:27:43.519 "name": "BaseBdev2",
00:27:43.519 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:43.519 "is_configured": true,
00:27:43.519 "data_offset": 0,
00:27:43.519 "data_size": 65536
00:27:43.519 }
00:27:43.519 ]
00:27:43.519 }'
00:27:43.519 17:21:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:43.519 17:21:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:27:44.086 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:27:44.086 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:27:44.344 [2024-07-23 17:21:39.599738] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:27:44.344 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536
00:27:44.344 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:44.344 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']'
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1')
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:27:44.602 17:21:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
00:27:44.861 [2024-07-23 17:21:40.100863] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10a9a90
00:27:44.861 /dev/nbd0
00:27:44.861 17:21:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:27:44.861 17:21:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:27:44.862 1+0 records in
00:27:44.862 1+0 records out
00:27:44.862 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251316 s, 16.3 MB/s
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']'
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1
00:27:44.862 17:21:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
00:27:51.467 65536+0 records in
00:27:51.467 65536+0 records out
00:27:51.467 33554432 bytes (34 MB, 32 MiB) copied, 6.14435 s, 5.5 MB/s
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:27:51.467 [2024-07-23 17:21:46.582104] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:27:51.467 [2024-07-23 17:21:46.750596] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:51.467 17:21:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:51.726 17:21:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:51.726 "name": "raid_bdev1",
00:27:51.726 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac",
00:27:51.726 "strip_size_kb": 0,
00:27:51.726 "state": "online",
00:27:51.726 "raid_level": "raid1",
00:27:51.726 "superblock": false,
00:27:51.726 "num_base_bdevs": 2,
00:27:51.726 "num_base_bdevs_discovered": 1,
00:27:51.726 "num_base_bdevs_operational": 1,
00:27:51.726 "base_bdevs_list": [
00:27:51.726 {
00:27:51.726 "name": null,
00:27:51.726 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:51.726 "is_configured": false,
00:27:51.726 "data_offset": 0,
00:27:51.726 "data_size": 65536
00:27:51.726 },
00:27:51.726 {
00:27:51.726 "name": "BaseBdev2",
00:27:51.726 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:51.726 "is_configured": true,
00:27:51.726 "data_offset": 0,
00:27:51.726 "data_size": 65536
00:27:51.726 }
00:27:51.726 ]
00:27:51.726 }'
00:27:51.726 17:21:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:51.726 17:21:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:27:52.290 17:21:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:27:52.548 [2024-07-23 17:21:47.889636] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:27:52.548 [2024-07-23 17:21:47.895183] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10a9a80
00:27:52.548 [2024-07-23 17:21:47.897516] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:27:52.548 17:21:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:53.921 17:21:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:53.921 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:53.921 "name": "raid_bdev1",
00:27:53.921 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac",
00:27:53.921 "strip_size_kb": 0,
00:27:53.921 "state": "online",
00:27:53.921 "raid_level": "raid1",
00:27:53.921 "superblock": false,
00:27:53.921 "num_base_bdevs": 2,
00:27:53.921 "num_base_bdevs_discovered": 2,
00:27:53.921 "num_base_bdevs_operational": 2,
00:27:53.921 "process": {
00:27:53.921 "type": "rebuild",
00:27:53.921 "target": "spare",
00:27:53.921 "progress": {
00:27:53.921 "blocks": 24576,
00:27:53.921 "percent": 37
00:27:53.921 }
00:27:53.921 },
00:27:53.921 "base_bdevs_list": [
00:27:53.921 {
00:27:53.921 "name": "spare",
00:27:53.921 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db",
00:27:53.921 "is_configured": true,
00:27:53.921 "data_offset": 0,
00:27:53.921 "data_size": 65536
00:27:53.921 },
00:27:53.921 {
00:27:53.921 "name": "BaseBdev2",
00:27:53.921 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:53.921 "is_configured": true,
00:27:53.921 "data_offset": 0,
00:27:53.921 "data_size": 65536
00:27:53.921 }
00:27:53.921 ]
00:27:53.921 }'
00:27:53.921 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:53.921 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:27:53.921 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:53.921 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:27:53.921 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:27:54.179 [2024-07-23 17:21:49.483932] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:27:54.179 [2024-07-23 17:21:49.510553] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:27:54.179 [2024-07-23 17:21:49.510600] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:54.179 [2024-07-23 17:21:49.510615] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:27:54.179 [2024-07-23 17:21:49.510623] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:54.179 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:54.437 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:54.437 "name": "raid_bdev1",
00:27:54.437 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac", "strip_size_kb": 0,
00:27:54.437 "state": "online",
00:27:54.437 "raid_level": "raid1",
00:27:54.437 "superblock": false,
00:27:54.437 "num_base_bdevs": 2,
00:27:54.437 "num_base_bdevs_discovered": 1,
00:27:54.437 "num_base_bdevs_operational": 1,
00:27:54.437 "base_bdevs_list": [
00:27:54.437 {
00:27:54.437 "name": null,
00:27:54.437 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:54.437 "is_configured": false,
00:27:54.437 "data_offset": 0,
00:27:54.437 "data_size": 65536
00:27:54.437 },
00:27:54.437 {
00:27:54.437 "name": "BaseBdev2",
00:27:54.437 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:54.437 "is_configured": true,
00:27:54.437 "data_offset": 0,
00:27:54.437 "data_size": 65536
00:27:54.437 }
00:27:54.437 ]
00:27:54.437 }'
00:27:54.437 17:21:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:54.437 17:21:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:55.003 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:55.260 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:55.260 "name": "raid_bdev1",
00:27:55.260 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac",
00:27:55.260 "strip_size_kb": 0,
00:27:55.261 "state": "online",
00:27:55.261 "raid_level": "raid1",
00:27:55.261 "superblock": false,
00:27:55.261 "num_base_bdevs": 2,
00:27:55.261 "num_base_bdevs_discovered": 1,
00:27:55.261 "num_base_bdevs_operational": 1,
00:27:55.261 "base_bdevs_list": [
00:27:55.261 {
00:27:55.261 "name": null,
00:27:55.261 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:55.261 "is_configured": false,
00:27:55.261 "data_offset": 0,
00:27:55.261 "data_size": 65536
00:27:55.261 },
00:27:55.261 {
00:27:55.261 "name": "BaseBdev2",
00:27:55.261 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:55.261 "is_configured": true,
00:27:55.261 "data_offset": 0,
00:27:55.261 "data_size": 65536
00:27:55.261 }
00:27:55.261 ]
00:27:55.261 }'
00:27:55.261 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:55.261 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:27:55.261 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:55.261 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:27:55.261 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:27:55.518 [2024-07-23 17:21:50.863371] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:27:55.518 [2024-07-23 17:21:50.868210] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1255cd0
00:27:55.518 [2024-07-23 17:21:50.869640] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:27:55.518 17:21:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1
00:27:56.890 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:27:56.890 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:56.890 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:27:56.890 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:27:56.891 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:56.891 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:56.891 17:21:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:56.891 "name": "raid_bdev1",
00:27:56.891 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac",
00:27:56.891 "strip_size_kb": 0,
00:27:56.891 "state": "online",
00:27:56.891 "raid_level": "raid1",
00:27:56.891 "superblock": false,
00:27:56.891 "num_base_bdevs": 2,
00:27:56.891 "num_base_bdevs_discovered": 2,
00:27:56.891 "num_base_bdevs_operational": 2,
00:27:56.891 "process": {
00:27:56.891 "type": "rebuild",
00:27:56.891 "target": "spare",
00:27:56.891 "progress": {
00:27:56.891 "blocks": 24576,
00:27:56.891 "percent": 37
00:27:56.891 }
00:27:56.891 },
00:27:56.891 "base_bdevs_list": [
00:27:56.891 {
00:27:56.891 "name": "spare",
00:27:56.891 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db",
00:27:56.891 "is_configured": true,
00:27:56.891 "data_offset": 0,
00:27:56.891 "data_size": 65536
00:27:56.891 },
00:27:56.891 {
00:27:56.891 "name": "BaseBdev2",
00:27:56.891 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:56.891 "is_configured": true,
00:27:56.891 "data_offset": 0,
00:27:56.891 "data_size": 65536
00:27:56.891 }
00:27:56.891 ]
00:27:56.891 }'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=812
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:56.891 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:57.149 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:57.149 "name": "raid_bdev1",
00:27:57.149 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac",
00:27:57.149 "strip_size_kb": 0,
00:27:57.149 "state": "online",
00:27:57.149 "raid_level": "raid1",
00:27:57.149 "superblock": false,
00:27:57.149 "num_base_bdevs": 2,
00:27:57.149 "num_base_bdevs_discovered": 2,
00:27:57.149 "num_base_bdevs_operational": 2,
00:27:57.149 "process": {
00:27:57.149 "type": "rebuild",
00:27:57.149 "target": "spare",
00:27:57.149 "progress": {
00:27:57.149 "blocks": 30720,
00:27:57.149 "percent": 46
00:27:57.149 }
00:27:57.149 },
00:27:57.149 "base_bdevs_list": [
00:27:57.149 {
00:27:57.149 "name": "spare",
00:27:57.149 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db",
00:27:57.149 "is_configured": true,
00:27:57.149 "data_offset": 0,
00:27:57.149 "data_size": 65536
00:27:57.149 },
00:27:57.149 {
00:27:57.149 "name": "BaseBdev2",
00:27:57.149 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572",
00:27:57.149 "is_configured": true,
00:27:57.149 "data_offset": 0,
00:27:57.149 "data_size": 65536
00:27:57.149 }
00:27:57.149 ]
00:27:57.149 }'
00:27:57.149 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:57.149 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:27:57.149 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:57.406 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:27:57.406 17:21:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1
00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local
process_type=rebuild 00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.337 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.594 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:58.594 "name": "raid_bdev1", 00:27:58.594 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac", 00:27:58.594 "strip_size_kb": 0, 00:27:58.594 "state": "online", 00:27:58.594 "raid_level": "raid1", 00:27:58.594 "superblock": false, 00:27:58.594 "num_base_bdevs": 2, 00:27:58.594 "num_base_bdevs_discovered": 2, 00:27:58.594 "num_base_bdevs_operational": 2, 00:27:58.594 "process": { 00:27:58.594 "type": "rebuild", 00:27:58.594 "target": "spare", 00:27:58.594 "progress": { 00:27:58.594 "blocks": 57344, 00:27:58.594 "percent": 87 00:27:58.594 } 00:27:58.594 }, 00:27:58.594 "base_bdevs_list": [ 00:27:58.594 { 00:27:58.594 "name": "spare", 00:27:58.594 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db", 00:27:58.594 "is_configured": true, 00:27:58.594 "data_offset": 0, 00:27:58.594 "data_size": 65536 00:27:58.594 }, 00:27:58.594 { 00:27:58.594 "name": "BaseBdev2", 00:27:58.594 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572", 00:27:58.594 "is_configured": true, 00:27:58.594 "data_offset": 0, 00:27:58.594 "data_size": 65536 00:27:58.594 } 00:27:58.594 ] 00:27:58.594 }' 00:27:58.594 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:58.594 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:58.594 17:21:53 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:58.594 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:58.594 17:21:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:58.852 [2024-07-23 17:21:54.094787] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:58.852 [2024-07-23 17:21:54.094847] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:58.852 [2024-07-23 17:21:54.094888] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:59.784 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.785 17:21:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.785 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.785 "name": "raid_bdev1", 00:27:59.785 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac", 00:27:59.785 "strip_size_kb": 0, 00:27:59.785 "state": "online", 00:27:59.785 "raid_level": "raid1", 00:27:59.785 "superblock": false, 00:27:59.785 "num_base_bdevs": 2, 00:27:59.785 
"num_base_bdevs_discovered": 2, 00:27:59.785 "num_base_bdevs_operational": 2, 00:27:59.785 "base_bdevs_list": [ 00:27:59.785 { 00:27:59.785 "name": "spare", 00:27:59.785 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db", 00:27:59.785 "is_configured": true, 00:27:59.785 "data_offset": 0, 00:27:59.785 "data_size": 65536 00:27:59.785 }, 00:27:59.785 { 00:27:59.785 "name": "BaseBdev2", 00:27:59.785 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572", 00:27:59.785 "is_configured": true, 00:27:59.785 "data_offset": 0, 00:27:59.785 "data_size": 65536 00:27:59.785 } 00:27:59.785 ] 00:27:59.785 }' 00:27:59.785 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.785 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:59.785 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.042 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.300 17:21:55 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.300 "name": "raid_bdev1", 00:28:00.300 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac", 00:28:00.300 "strip_size_kb": 0, 00:28:00.300 "state": "online", 00:28:00.300 "raid_level": "raid1", 00:28:00.300 "superblock": false, 00:28:00.300 "num_base_bdevs": 2, 00:28:00.300 "num_base_bdevs_discovered": 2, 00:28:00.300 "num_base_bdevs_operational": 2, 00:28:00.300 "base_bdevs_list": [ 00:28:00.300 { 00:28:00.300 "name": "spare", 00:28:00.300 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db", 00:28:00.300 "is_configured": true, 00:28:00.300 "data_offset": 0, 00:28:00.300 "data_size": 65536 00:28:00.300 }, 00:28:00.300 { 00:28:00.300 "name": "BaseBdev2", 00:28:00.300 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572", 00:28:00.300 "is_configured": true, 00:28:00.300 "data_offset": 0, 00:28:00.300 "data_size": 65536 00:28:00.300 } 00:28:00.300 ] 00:28:00.300 }' 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.300 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.558 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.558 "name": "raid_bdev1", 00:28:00.558 "uuid": "22093822-1d4b-4ff3-bcda-95029e58eeac", 00:28:00.558 "strip_size_kb": 0, 00:28:00.558 "state": "online", 00:28:00.558 "raid_level": "raid1", 00:28:00.558 "superblock": false, 00:28:00.558 "num_base_bdevs": 2, 00:28:00.558 "num_base_bdevs_discovered": 2, 00:28:00.558 "num_base_bdevs_operational": 2, 00:28:00.558 "base_bdevs_list": [ 00:28:00.558 { 00:28:00.558 "name": "spare", 00:28:00.558 "uuid": "a1e6436f-b545-54ff-8afa-12996a8e80db", 00:28:00.558 "is_configured": true, 00:28:00.558 "data_offset": 0, 00:28:00.558 "data_size": 65536 00:28:00.558 }, 00:28:00.558 { 00:28:00.558 "name": "BaseBdev2", 00:28:00.558 "uuid": "ccd18d0a-cbac-5f09-a83b-8d2351c28572", 00:28:00.558 "is_configured": true, 00:28:00.558 "data_offset": 0, 00:28:00.558 "data_size": 65536 00:28:00.558 } 00:28:00.558 ] 00:28:00.558 }' 00:28:00.558 17:21:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.558 17:21:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:28:01.123 17:21:56 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:01.381 [2024-07-23 17:21:56.642635] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:01.381 [2024-07-23 17:21:56.642663] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:01.381 [2024-07-23 17:21:56.642719] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:01.381 [2024-07-23 17:21:56.642775] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:01.381 [2024-07-23 17:21:56.642787] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1143ea0 name raid_bdev1, state offline 00:28:01.381 17:21:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:28:01.381 17:21:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.638 17:21:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:01.638 17:21:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:01.638 17:21:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
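Throughout the trace above, the test fetches every RAID bdev with `rpc.py ... bdev_raid_get_bdevs all`, then narrows the result with `jq -r '.[] | select(.name == "raid_bdev1")'` and reads fields via `jq -r '.process.type // "none"'`. A minimal Python sketch of that same selection logic, using a trimmed copy of the JSON that appears in the log (the helper names here are illustrative, not SPDK code):

```python
import json

# Abbreviated bdev_raid_get_bdevs output, copied from the trace above.
RAID_BDEVS = json.loads("""
[
  {
    "name": "raid_bdev1",
    "state": "online",
    "raid_level": "raid1",
    "process": {
      "type": "rebuild",
      "target": "spare",
      "progress": {"blocks": 30720, "percent": 46}
    }
  }
]
""")

def select_bdev(bdevs, name):
    """Equivalent of: jq -r '.[] | select(.name == "raid_bdev1")'"""
    return next((b for b in bdevs if b["name"] == name), None)

def process_field(info, field):
    """Equivalent of: jq -r '.process.type // "none"' (and .target):
    fall back to "none" once the rebuild process object disappears."""
    return (info.get("process") or {}).get(field, "none")

info = select_bdev(RAID_BDEVS, "raid_bdev1")
print(process_field(info, "type"))    # rebuild
print(process_field(info, "target"))  # spare
```

The `// "none"` fallback is what lets the loop in the trace detect completion: once the rebuild finishes, the `process` object vanishes from the RPC output and the same query yields `none`.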
00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:01.639 17:21:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:01.895 /dev/nbd0 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:01.895 1+0 records in 00:28:01.895 1+0 records out 00:28:01.895 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216624 s, 18.9 MB/s 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:01.895 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:02.152 /dev/nbd1 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
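The `waitfornbd` helper traced above (`common/autotest_common.sh@869-887`) retries up to 20 times, grepping `/proc/partitions` for the nbd name before declaring the device usable. A generic sketch of that bounded-retry shape, with the partitions check factored out so it can run without real nbd devices (function names are illustrative):

```python
import time

def wait_for(probe, attempts=20, delay=0.1):
    """Retry probe() up to `attempts` times, mirroring the
    '(( i <= 20 ))' loop in waitfornbd; True once probe succeeds."""
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

def nbd_present(name, partitions_text):
    """Rough analogue of `grep -q -w nbd0 /proc/partitions`:
    the device name is the last column of each data row."""
    return any(line.split()[-1] == name
               for line in partitions_text.splitlines() if line.strip())
```

The real helper additionally does a one-block `dd` read and `stat`s the result (the `1+0 records in / 4096 bytes` lines in the log) to confirm the device actually serves I/O, not just that it is listed.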
00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.152 1+0 records in 00:28:02.152 1+0 records out 00:28:02.152 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334016 s, 12.3 MB/s 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:02.152 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:02.718 17:21:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:28:02.718 17:21:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 26555 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 26555 ']' 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 26555 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:02.718 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 26555 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 26555' 00:28:02.976 killing process with pid 26555 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 26555 00:28:02.976 Received shutdown signal, test time was about 60.000000 seconds 00:28:02.976 00:28:02.976 Latency(us) 00:28:02.976 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:02.976 =================================================================================================================== 00:28:02.976 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:02.976 [2024-07-23 17:21:58.171975] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 26555 00:28:02.976 [2024-07-23 17:21:58.197889] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 
0 00:28:02.976 00:28:02.976 real 0m22.850s 00:28:02.976 user 0m29.878s 00:28:02.976 sys 0m5.541s 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:02.976 17:21:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:28:02.976 ************************************ 00:28:02.976 END TEST raid_rebuild_test 00:28:02.976 ************************************ 00:28:03.235 17:21:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:03.235 17:21:58 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:28:03.235 17:21:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:03.235 17:21:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:03.235 17:21:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:03.235 ************************************ 00:28:03.235 START TEST raid_rebuild_test_sb 00:28:03.235 ************************************ 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo 
BaseBdev1 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=29611 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 29611 /var/tmp/spdk-raid.sock 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 29611 ']' 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:03.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:03.235 17:21:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:03.235 [2024-07-23 17:21:58.540910] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:28:03.235 [2024-07-23 17:21:58.540975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid29611 ] 00:28:03.235 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:03.235 Zero copy mechanism will not be used. 
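Earlier in the trace, `bdev_raid.sh@737` checks rebuild correctness byte-for-byte with `cmp -i 0 /dev/nbd0 /dev/nbd1` across the two nbd-exported bdevs. A self-contained sketch of that comparison over file-like objects, chunked so a large device is never read into memory at once (names and chunk size are illustrative):

```python
def devices_match(a, b, offset=0, chunk=1 << 20):
    """Chunked analogue of `cmp -i 0 /dev/nbd0 /dev/nbd1`: True when
    both streams hold identical bytes from `offset` to EOF."""
    a.seek(offset)
    b.seek(offset)
    while True:
        ca, cb = a.read(chunk), b.read(chunk)
        if ca != cb:          # mismatch, or one stream ended early
            return False
        if not ca:            # both exhausted at the same point
            return True
```

In the test this runs only after the rebuild poll reports completion, which is why a mismatch here would mean the spare was not rebuilt to match BaseBdev1.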
00:28:03.494 [2024-07-23 17:21:58.671500] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:03.494 [2024-07-23 17:21:58.723435] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:03.494 [2024-07-23 17:21:58.783332] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:03.494 [2024-07-23 17:21:58.783369] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:04.059 17:21:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:04.059 17:21:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:28:04.059 17:21:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:04.059 17:21:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:04.317 BaseBdev1_malloc 00:28:04.317 17:21:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:04.575 [2024-07-23 17:21:59.929443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:04.575 [2024-07-23 17:21:59.929496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:04.575 [2024-07-23 17:21:59.929523] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2371170 00:28:04.575 [2024-07-23 17:21:59.929535] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:04.575 [2024-07-23 17:21:59.931189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:04.575 [2024-07-23 17:21:59.931219] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:04.575 BaseBdev1 
00:28:04.575 17:21:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:04.575 17:21:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:04.833 BaseBdev2_malloc 00:28:04.833 17:22:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:05.091 [2024-07-23 17:22:00.420680] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:05.091 [2024-07-23 17:22:00.420727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:05.091 [2024-07-23 17:22:00.420749] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2257680 00:28:05.091 [2024-07-23 17:22:00.420762] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:05.091 [2024-07-23 17:22:00.422285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:05.091 [2024-07-23 17:22:00.422319] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:05.091 BaseBdev2 00:28:05.091 17:22:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:28:05.349 spare_malloc 00:28:05.349 17:22:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:05.610 spare_delay 00:28:05.610 17:22:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:05.888 [2024-07-23 17:22:01.155162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:05.888 [2024-07-23 17:22:01.155209] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:05.888 [2024-07-23 17:22:01.155230] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225b2a0 00:28:05.888 [2024-07-23 17:22:01.155243] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:05.888 [2024-07-23 17:22:01.156805] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:05.888 [2024-07-23 17:22:01.156834] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:05.888 spare 00:28:05.888 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:06.160 [2024-07-23 17:22:01.399843] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:06.160 [2024-07-23 17:22:01.401238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:06.160 [2024-07-23 17:22:01.401407] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x225aea0 00:28:06.160 [2024-07-23 17:22:01.401420] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:06.160 [2024-07-23 17:22:01.401623] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225d010 00:28:06.160 [2024-07-23 17:22:01.401767] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x225aea0 00:28:06.160 [2024-07-23 17:22:01.401777] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x225aea0 00:28:06.160 [2024-07-23 17:22:01.401880] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.160 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.418 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.418 "name": "raid_bdev1", 00:28:06.418 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:06.418 "strip_size_kb": 0, 00:28:06.418 "state": "online", 00:28:06.418 "raid_level": "raid1", 00:28:06.418 "superblock": true, 00:28:06.418 "num_base_bdevs": 2, 00:28:06.418 "num_base_bdevs_discovered": 2, 00:28:06.418 
"num_base_bdevs_operational": 2, 00:28:06.418 "base_bdevs_list": [ 00:28:06.418 { 00:28:06.418 "name": "BaseBdev1", 00:28:06.418 "uuid": "19b75c10-48b9-5a84-b02d-75c099741bb3", 00:28:06.418 "is_configured": true, 00:28:06.418 "data_offset": 2048, 00:28:06.418 "data_size": 63488 00:28:06.418 }, 00:28:06.418 { 00:28:06.418 "name": "BaseBdev2", 00:28:06.418 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:06.418 "is_configured": true, 00:28:06.418 "data_offset": 2048, 00:28:06.418 "data_size": 63488 00:28:06.418 } 00:28:06.418 ] 00:28:06.418 }' 00:28:06.418 17:22:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.418 17:22:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:06.983 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:06.983 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:07.240 [2024-07-23 17:22:02.511012] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:07.240 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:28:07.240 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.240 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:07.498 17:22:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:07.756 [2024-07-23 17:22:03.024159] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21bece0 00:28:07.756 /dev/nbd0 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:07.756 1+0 records in 00:28:07.756 1+0 records out 00:28:07.756 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267988 s, 15.3 MB/s 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:07.756 17:22:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:28:14.313 63488+0 records in 00:28:14.313 63488+0 records out 00:28:14.313 32505856 bytes (33 MB, 
31 MiB) copied, 6.07724 s, 5.3 MB/s 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:14.313 [2024-07-23 17:22:09.438632] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:28:14.313 [2024-07-23 17:22:09.671327] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.313 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.571 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:14.571 "name": "raid_bdev1", 00:28:14.571 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:14.571 "strip_size_kb": 0, 00:28:14.571 "state": "online", 00:28:14.571 "raid_level": "raid1", 00:28:14.571 "superblock": true, 00:28:14.571 "num_base_bdevs": 2, 00:28:14.571 "num_base_bdevs_discovered": 1, 00:28:14.571 
"num_base_bdevs_operational": 1, 00:28:14.571 "base_bdevs_list": [ 00:28:14.571 { 00:28:14.571 "name": null, 00:28:14.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.571 "is_configured": false, 00:28:14.571 "data_offset": 2048, 00:28:14.571 "data_size": 63488 00:28:14.571 }, 00:28:14.571 { 00:28:14.571 "name": "BaseBdev2", 00:28:14.571 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:14.571 "is_configured": true, 00:28:14.571 "data_offset": 2048, 00:28:14.571 "data_size": 63488 00:28:14.571 } 00:28:14.571 ] 00:28:14.571 }' 00:28:14.571 17:22:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:14.571 17:22:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:15.136 17:22:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:15.393 [2024-07-23 17:22:10.770228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:15.393 [2024-07-23 17:22:10.775103] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225bbe0 00:28:15.393 [2024-07-23 17:22:10.777384] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:15.393 17:22:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.765 17:22:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.765 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:16.765 "name": "raid_bdev1", 00:28:16.765 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:16.765 "strip_size_kb": 0, 00:28:16.765 "state": "online", 00:28:16.765 "raid_level": "raid1", 00:28:16.765 "superblock": true, 00:28:16.765 "num_base_bdevs": 2, 00:28:16.765 "num_base_bdevs_discovered": 2, 00:28:16.765 "num_base_bdevs_operational": 2, 00:28:16.765 "process": { 00:28:16.765 "type": "rebuild", 00:28:16.765 "target": "spare", 00:28:16.765 "progress": { 00:28:16.765 "blocks": 24576, 00:28:16.765 "percent": 38 00:28:16.765 } 00:28:16.765 }, 00:28:16.765 "base_bdevs_list": [ 00:28:16.765 { 00:28:16.765 "name": "spare", 00:28:16.765 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:16.765 "is_configured": true, 00:28:16.765 "data_offset": 2048, 00:28:16.765 "data_size": 63488 00:28:16.765 }, 00:28:16.765 { 00:28:16.765 "name": "BaseBdev2", 00:28:16.765 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:16.765 "is_configured": true, 00:28:16.765 "data_offset": 2048, 00:28:16.765 "data_size": 63488 00:28:16.765 } 00:28:16.765 ] 00:28:16.765 }' 00:28:16.765 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:16.765 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:16.765 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:16.765 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:16.765 17:22:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:17.023 [2024-07-23 17:22:12.355734] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.023 [2024-07-23 17:22:12.390053] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:17.023 [2024-07-23 17:22:12.390096] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:17.023 [2024-07-23 17:22:12.390111] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.023 [2024-07-23 17:22:12.390119] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:17.023 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.024 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.024 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.024 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.024 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.024 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.282 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:17.282 "name": "raid_bdev1", 00:28:17.282 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:17.282 "strip_size_kb": 0, 00:28:17.282 "state": "online", 00:28:17.282 "raid_level": "raid1", 00:28:17.282 "superblock": true, 00:28:17.282 "num_base_bdevs": 2, 00:28:17.282 "num_base_bdevs_discovered": 1, 00:28:17.282 "num_base_bdevs_operational": 1, 00:28:17.282 "base_bdevs_list": [ 00:28:17.282 { 00:28:17.282 "name": null, 00:28:17.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.282 "is_configured": false, 00:28:17.282 "data_offset": 2048, 00:28:17.282 "data_size": 63488 00:28:17.282 }, 00:28:17.282 { 00:28:17.282 "name": "BaseBdev2", 00:28:17.282 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:17.282 "is_configured": true, 00:28:17.282 "data_offset": 2048, 00:28:17.282 "data_size": 63488 00:28:17.282 } 00:28:17.282 ] 00:28:17.282 }' 00:28:17.282 17:22:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:17.282 17:22:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:18.218 17:22:13 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:18.218 "name": "raid_bdev1", 00:28:18.218 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:18.218 "strip_size_kb": 0, 00:28:18.218 "state": "online", 00:28:18.218 "raid_level": "raid1", 00:28:18.218 "superblock": true, 00:28:18.218 "num_base_bdevs": 2, 00:28:18.218 "num_base_bdevs_discovered": 1, 00:28:18.218 "num_base_bdevs_operational": 1, 00:28:18.218 "base_bdevs_list": [ 00:28:18.218 { 00:28:18.218 "name": null, 00:28:18.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:18.218 "is_configured": false, 00:28:18.218 "data_offset": 2048, 00:28:18.218 "data_size": 63488 00:28:18.218 }, 00:28:18.218 { 00:28:18.218 "name": "BaseBdev2", 00:28:18.218 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:18.218 "is_configured": true, 00:28:18.218 "data_offset": 2048, 00:28:18.218 "data_size": 63488 00:28:18.218 } 00:28:18.218 ] 00:28:18.218 }' 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:18.218 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:18.477 [2024-07-23 17:22:13.854306] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:18.477 [2024-07-23 17:22:13.859848] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21bece0 00:28:18.477 [2024-07-23 17:22:13.861344] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:18.477 17:22:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.852 17:22:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.852 "name": "raid_bdev1", 00:28:19.852 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:19.852 "strip_size_kb": 0, 00:28:19.852 "state": "online", 00:28:19.852 "raid_level": "raid1", 00:28:19.852 "superblock": true, 00:28:19.852 "num_base_bdevs": 2, 00:28:19.852 "num_base_bdevs_discovered": 2, 00:28:19.852 "num_base_bdevs_operational": 2, 00:28:19.852 "process": { 00:28:19.852 "type": "rebuild", 00:28:19.852 "target": "spare", 00:28:19.852 "progress": { 00:28:19.852 "blocks": 22528, 00:28:19.852 "percent": 35 00:28:19.852 } 00:28:19.852 }, 00:28:19.852 
"base_bdevs_list": [ 00:28:19.852 { 00:28:19.852 "name": "spare", 00:28:19.852 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:19.852 "is_configured": true, 00:28:19.852 "data_offset": 2048, 00:28:19.852 "data_size": 63488 00:28:19.852 }, 00:28:19.852 { 00:28:19.852 "name": "BaseBdev2", 00:28:19.852 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:19.852 "is_configured": true, 00:28:19.852 "data_offset": 2048, 00:28:19.852 "data_size": 63488 00:28:19.852 } 00:28:19.852 ] 00:28:19.852 }' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:19.852 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=835 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.852 17:22:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.852 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.111 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:20.111 "name": "raid_bdev1", 00:28:20.111 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:20.111 "strip_size_kb": 0, 00:28:20.111 "state": "online", 00:28:20.111 "raid_level": "raid1", 00:28:20.111 "superblock": true, 00:28:20.111 "num_base_bdevs": 2, 00:28:20.111 "num_base_bdevs_discovered": 2, 00:28:20.111 "num_base_bdevs_operational": 2, 00:28:20.111 "process": { 00:28:20.111 "type": "rebuild", 00:28:20.111 "target": "spare", 00:28:20.111 "progress": { 00:28:20.111 "blocks": 30720, 00:28:20.111 "percent": 48 00:28:20.111 } 00:28:20.111 }, 00:28:20.111 "base_bdevs_list": [ 00:28:20.111 { 00:28:20.111 "name": "spare", 00:28:20.111 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:20.111 "is_configured": true, 00:28:20.111 "data_offset": 2048, 00:28:20.111 "data_size": 63488 00:28:20.111 }, 00:28:20.111 { 00:28:20.111 "name": "BaseBdev2", 00:28:20.111 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:20.111 "is_configured": true, 00:28:20.111 "data_offset": 2048, 00:28:20.111 "data_size": 63488 00:28:20.111 } 00:28:20.111 ] 00:28:20.111 }' 00:28:20.111 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:28:20.111 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:20.111 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:20.111 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:20.111 17:22:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.488 "name": "raid_bdev1", 00:28:21.488 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:21.488 "strip_size_kb": 0, 00:28:21.488 "state": "online", 00:28:21.488 "raid_level": "raid1", 00:28:21.488 "superblock": true, 00:28:21.488 "num_base_bdevs": 2, 00:28:21.488 "num_base_bdevs_discovered": 2, 00:28:21.488 "num_base_bdevs_operational": 2, 00:28:21.488 "process": { 00:28:21.488 "type": "rebuild", 00:28:21.488 "target": "spare", 
00:28:21.488 "progress": { 00:28:21.488 "blocks": 57344, 00:28:21.488 "percent": 90 00:28:21.488 } 00:28:21.488 }, 00:28:21.488 "base_bdevs_list": [ 00:28:21.488 { 00:28:21.488 "name": "spare", 00:28:21.488 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:21.488 "is_configured": true, 00:28:21.488 "data_offset": 2048, 00:28:21.488 "data_size": 63488 00:28:21.488 }, 00:28:21.488 { 00:28:21.488 "name": "BaseBdev2", 00:28:21.488 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:21.488 "is_configured": true, 00:28:21.488 "data_offset": 2048, 00:28:21.488 "data_size": 63488 00:28:21.488 } 00:28:21.488 ] 00:28:21.488 }' 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:21.488 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:21.745 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:21.745 17:22:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:21.745 [2024-07-23 17:22:16.985860] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:21.745 [2024-07-23 17:22:16.985919] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:21.745 [2024-07-23 17:22:16.986001] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.679 17:22:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:22.938 "name": "raid_bdev1", 00:28:22.938 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:22.938 "strip_size_kb": 0, 00:28:22.938 "state": "online", 00:28:22.938 "raid_level": "raid1", 00:28:22.938 "superblock": true, 00:28:22.938 "num_base_bdevs": 2, 00:28:22.938 "num_base_bdevs_discovered": 2, 00:28:22.938 "num_base_bdevs_operational": 2, 00:28:22.938 "base_bdevs_list": [ 00:28:22.938 { 00:28:22.938 "name": "spare", 00:28:22.938 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:22.938 "is_configured": true, 00:28:22.938 "data_offset": 2048, 00:28:22.938 "data_size": 63488 00:28:22.938 }, 00:28:22.938 { 00:28:22.938 "name": "BaseBdev2", 00:28:22.938 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:22.938 "is_configured": true, 00:28:22.938 "data_offset": 2048, 00:28:22.938 "data_size": 63488 00:28:22.938 } 00:28:22.938 ] 00:28:22.938 }' 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:22.938 
17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.938 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:23.196 "name": "raid_bdev1", 00:28:23.196 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:23.196 "strip_size_kb": 0, 00:28:23.196 "state": "online", 00:28:23.196 "raid_level": "raid1", 00:28:23.196 "superblock": true, 00:28:23.196 "num_base_bdevs": 2, 00:28:23.196 "num_base_bdevs_discovered": 2, 00:28:23.196 "num_base_bdevs_operational": 2, 00:28:23.196 "base_bdevs_list": [ 00:28:23.196 { 00:28:23.196 "name": "spare", 00:28:23.196 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:23.196 "is_configured": true, 00:28:23.196 "data_offset": 2048, 00:28:23.196 "data_size": 63488 00:28:23.196 }, 00:28:23.196 { 00:28:23.196 "name": "BaseBdev2", 00:28:23.196 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:23.196 "is_configured": true, 00:28:23.196 "data_offset": 2048, 00:28:23.196 "data_size": 63488 00:28:23.196 } 00:28:23.196 ] 00:28:23.196 }' 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.196 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.454 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.454 "name": "raid_bdev1", 00:28:23.454 "uuid": 
"70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:23.454 "strip_size_kb": 0, 00:28:23.454 "state": "online", 00:28:23.454 "raid_level": "raid1", 00:28:23.454 "superblock": true, 00:28:23.454 "num_base_bdevs": 2, 00:28:23.454 "num_base_bdevs_discovered": 2, 00:28:23.454 "num_base_bdevs_operational": 2, 00:28:23.454 "base_bdevs_list": [ 00:28:23.454 { 00:28:23.454 "name": "spare", 00:28:23.454 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:23.454 "is_configured": true, 00:28:23.454 "data_offset": 2048, 00:28:23.454 "data_size": 63488 00:28:23.454 }, 00:28:23.454 { 00:28:23.454 "name": "BaseBdev2", 00:28:23.454 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:23.454 "is_configured": true, 00:28:23.454 "data_offset": 2048, 00:28:23.454 "data_size": 63488 00:28:23.454 } 00:28:23.454 ] 00:28:23.454 }' 00:28:23.454 17:22:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.454 17:22:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:24.019 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:24.277 [2024-07-23 17:22:19.658148] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:24.277 [2024-07-23 17:22:19.658173] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:24.277 [2024-07-23 17:22:19.658228] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:24.277 [2024-07-23 17:22:19.658286] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:24.277 [2024-07-23 17:22:19.658297] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225aea0 name raid_bdev1, state offline 00:28:24.277 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.277 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:24.596 17:22:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:24.855 /dev/nbd0 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:24.855 1+0 records in 00:28:24.855 1+0 records out 00:28:24.855 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246705 s, 16.6 MB/s 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:24.855 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:24.855 17:22:20 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:25.114 /dev/nbd1 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:25.114 1+0 records in 00:28:25.114 1+0 records out 00:28:25.114 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313509 s, 13.1 MB/s 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:25.114 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.373 17:22:20 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:25.373 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:25.631 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:25.631 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:25.631 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:25.631 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.631 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.631 17:22:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:25.631 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:28:25.631 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:28:25.632 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:25.632 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:25.890 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:26.148 [2024-07-23 17:22:21.475236] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:28:26.148 [2024-07-23 17:22:21.475281] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:26.148 [2024-07-23 17:22:21.475302] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225ca10 00:28:26.148 [2024-07-23 17:22:21.475314] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:26.148 [2024-07-23 17:22:21.476946] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:26.148 [2024-07-23 17:22:21.476974] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:26.148 [2024-07-23 17:22:21.477052] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:26.148 [2024-07-23 17:22:21.477078] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:26.148 [2024-07-23 17:22:21.477176] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:26.148 spare 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.148 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.406 [2024-07-23 17:22:21.577489] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21bdce0 00:28:26.406 [2024-07-23 17:22:21.577507] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:26.406 [2024-07-23 17:22:21.577712] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21bebb0 00:28:26.406 [2024-07-23 17:22:21.577869] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21bdce0 00:28:26.406 [2024-07-23 17:22:21.577879] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21bdce0 00:28:26.406 [2024-07-23 17:22:21.577995] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:26.406 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.406 "name": "raid_bdev1", 00:28:26.406 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:26.406 "strip_size_kb": 0, 00:28:26.406 "state": "online", 00:28:26.406 "raid_level": "raid1", 00:28:26.406 "superblock": true, 00:28:26.406 "num_base_bdevs": 2, 00:28:26.406 "num_base_bdevs_discovered": 2, 00:28:26.406 "num_base_bdevs_operational": 2, 00:28:26.406 "base_bdevs_list": [ 00:28:26.406 { 00:28:26.406 "name": "spare", 00:28:26.406 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:26.406 "is_configured": true, 00:28:26.406 "data_offset": 2048, 00:28:26.406 "data_size": 63488 00:28:26.406 }, 00:28:26.406 { 00:28:26.406 "name": "BaseBdev2", 00:28:26.406 "uuid": 
"203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:26.406 "is_configured": true, 00:28:26.406 "data_offset": 2048, 00:28:26.406 "data_size": 63488 00:28:26.406 } 00:28:26.406 ] 00:28:26.406 }' 00:28:26.406 17:22:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.406 17:22:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.342 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.601 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:27.601 "name": "raid_bdev1", 00:28:27.601 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:27.601 "strip_size_kb": 0, 00:28:27.601 "state": "online", 00:28:27.601 "raid_level": "raid1", 00:28:27.601 "superblock": true, 00:28:27.601 "num_base_bdevs": 2, 00:28:27.601 "num_base_bdevs_discovered": 2, 00:28:27.601 "num_base_bdevs_operational": 2, 00:28:27.601 "base_bdevs_list": [ 00:28:27.601 { 00:28:27.601 "name": "spare", 00:28:27.601 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:27.601 "is_configured": true, 00:28:27.601 "data_offset": 2048, 00:28:27.601 "data_size": 63488 00:28:27.601 }, 00:28:27.601 { 
00:28:27.601 "name": "BaseBdev2", 00:28:27.601 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:27.601 "is_configured": true, 00:28:27.601 "data_offset": 2048, 00:28:27.601 "data_size": 63488 00:28:27.601 } 00:28:27.601 ] 00:28:27.601 }' 00:28:27.601 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:27.601 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:27.601 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:27.601 17:22:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:27.601 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:27.601 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.859 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:27.859 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:28.118 [2024-07-23 17:22:23.476778] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.118 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:28.685 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.685 "name": "raid_bdev1", 00:28:28.685 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:28.685 "strip_size_kb": 0, 00:28:28.685 "state": "online", 00:28:28.685 "raid_level": "raid1", 00:28:28.685 "superblock": true, 00:28:28.685 "num_base_bdevs": 2, 00:28:28.685 "num_base_bdevs_discovered": 1, 00:28:28.685 "num_base_bdevs_operational": 1, 00:28:28.685 "base_bdevs_list": [ 00:28:28.685 { 00:28:28.685 "name": null, 00:28:28.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.685 "is_configured": false, 00:28:28.685 "data_offset": 2048, 00:28:28.685 "data_size": 63488 00:28:28.685 }, 00:28:28.685 { 00:28:28.685 "name": "BaseBdev2", 00:28:28.685 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:28.685 "is_configured": true, 00:28:28.685 "data_offset": 2048, 00:28:28.685 "data_size": 63488 00:28:28.685 } 00:28:28.685 ] 00:28:28.685 }' 00:28:28.685 17:22:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.685 17:22:23 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:28:28.944 17:22:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:29.202 [2024-07-23 17:22:24.563690] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:29.202 [2024-07-23 17:22:24.563835] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:29.202 [2024-07-23 17:22:24.563851] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:29.202 [2024-07-23 17:22:24.563880] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:29.202 [2024-07-23 17:22:24.568586] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21bfbe0 00:28:29.202 [2024-07-23 17:22:24.569913] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:29.202 17:22:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:30.172 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:30.172 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:30.172 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:30.172 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:30.172 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:30.431 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.431 17:22:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.431 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:30.431 "name": "raid_bdev1", 00:28:30.431 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:30.431 "strip_size_kb": 0, 00:28:30.431 "state": "online", 00:28:30.431 "raid_level": "raid1", 00:28:30.431 "superblock": true, 00:28:30.431 "num_base_bdevs": 2, 00:28:30.431 "num_base_bdevs_discovered": 2, 00:28:30.431 "num_base_bdevs_operational": 2, 00:28:30.431 "process": { 00:28:30.431 "type": "rebuild", 00:28:30.431 "target": "spare", 00:28:30.431 "progress": { 00:28:30.431 "blocks": 24576, 00:28:30.431 "percent": 38 00:28:30.431 } 00:28:30.431 }, 00:28:30.431 "base_bdevs_list": [ 00:28:30.431 { 00:28:30.431 "name": "spare", 00:28:30.431 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:30.431 "is_configured": true, 00:28:30.431 "data_offset": 2048, 00:28:30.431 "data_size": 63488 00:28:30.431 }, 00:28:30.431 { 00:28:30.431 "name": "BaseBdev2", 00:28:30.431 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:30.431 "is_configured": true, 00:28:30.431 "data_offset": 2048, 00:28:30.431 "data_size": 63488 00:28:30.431 } 00:28:30.431 ] 00:28:30.431 }' 00:28:30.431 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:30.689 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:30.689 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:30.689 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:30.689 17:22:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:30.947 [2024-07-23 17:22:26.169578] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:28:30.947 [2024-07-23 17:22:26.182480] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:30.947 [2024-07-23 17:22:26.182521] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:30.947 [2024-07-23 17:22:26.182543] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:30.947 [2024-07-23 17:22:26.182551] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.947 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.206 17:22:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.206 "name": "raid_bdev1", 00:28:31.206 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:31.206 "strip_size_kb": 0, 00:28:31.206 "state": "online", 00:28:31.206 "raid_level": "raid1", 00:28:31.206 "superblock": true, 00:28:31.206 "num_base_bdevs": 2, 00:28:31.206 "num_base_bdevs_discovered": 1, 00:28:31.206 "num_base_bdevs_operational": 1, 00:28:31.206 "base_bdevs_list": [ 00:28:31.206 { 00:28:31.206 "name": null, 00:28:31.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.206 "is_configured": false, 00:28:31.206 "data_offset": 2048, 00:28:31.206 "data_size": 63488 00:28:31.206 }, 00:28:31.206 { 00:28:31.206 "name": "BaseBdev2", 00:28:31.206 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:31.206 "is_configured": true, 00:28:31.206 "data_offset": 2048, 00:28:31.206 "data_size": 63488 00:28:31.206 } 00:28:31.206 ] 00:28:31.206 }' 00:28:31.206 17:22:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.206 17:22:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:31.773 17:22:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:32.031 [2024-07-23 17:22:27.269695] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:32.031 [2024-07-23 17:22:27.269744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:32.031 [2024-07-23 17:22:27.269765] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x236fbd0 00:28:32.031 [2024-07-23 17:22:27.269778] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:32.031 [2024-07-23 17:22:27.270139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:32.031 [2024-07-23 
17:22:27.270156] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:32.031 [2024-07-23 17:22:27.270230] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:32.031 [2024-07-23 17:22:27.270242] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:32.031 [2024-07-23 17:22:27.270252] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:32.031 [2024-07-23 17:22:27.270270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:32.031 [2024-07-23 17:22:27.274994] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21bfbe0 00:28:32.031 spare 00:28:32.031 [2024-07-23 17:22:27.276312] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:32.031 17:22:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.966 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.224 17:22:28 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:33.225 "name": "raid_bdev1", 00:28:33.225 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:33.225 "strip_size_kb": 0, 00:28:33.225 "state": "online", 00:28:33.225 "raid_level": "raid1", 00:28:33.225 "superblock": true, 00:28:33.225 "num_base_bdevs": 2, 00:28:33.225 "num_base_bdevs_discovered": 2, 00:28:33.225 "num_base_bdevs_operational": 2, 00:28:33.225 "process": { 00:28:33.225 "type": "rebuild", 00:28:33.225 "target": "spare", 00:28:33.225 "progress": { 00:28:33.225 "blocks": 24576, 00:28:33.225 "percent": 38 00:28:33.225 } 00:28:33.225 }, 00:28:33.225 "base_bdevs_list": [ 00:28:33.225 { 00:28:33.225 "name": "spare", 00:28:33.225 "uuid": "a62f1370-0cd2-5dac-9190-40dcfe1ce50e", 00:28:33.225 "is_configured": true, 00:28:33.225 "data_offset": 2048, 00:28:33.225 "data_size": 63488 00:28:33.225 }, 00:28:33.225 { 00:28:33.225 "name": "BaseBdev2", 00:28:33.225 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:33.225 "is_configured": true, 00:28:33.225 "data_offset": 2048, 00:28:33.225 "data_size": 63488 00:28:33.225 } 00:28:33.225 ] 00:28:33.225 }' 00:28:33.225 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:33.225 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:33.225 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:33.483 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:33.483 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:33.483 [2024-07-23 17:22:28.875562] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:33.483 [2024-07-23 17:22:28.888888] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:28:33.483 [2024-07-23 17:22:28.888937] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:33.483 [2024-07-23 17:22:28.888953] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:33.483 [2024-07-23 17:22:28.888961] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.742 17:22:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.309 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:34.309 "name": "raid_bdev1", 00:28:34.309 "uuid": 
"70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:34.309 "strip_size_kb": 0, 00:28:34.309 "state": "online", 00:28:34.309 "raid_level": "raid1", 00:28:34.309 "superblock": true, 00:28:34.309 "num_base_bdevs": 2, 00:28:34.309 "num_base_bdevs_discovered": 1, 00:28:34.309 "num_base_bdevs_operational": 1, 00:28:34.309 "base_bdevs_list": [ 00:28:34.309 { 00:28:34.309 "name": null, 00:28:34.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:34.309 "is_configured": false, 00:28:34.309 "data_offset": 2048, 00:28:34.309 "data_size": 63488 00:28:34.309 }, 00:28:34.309 { 00:28:34.309 "name": "BaseBdev2", 00:28:34.309 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:34.309 "is_configured": true, 00:28:34.309 "data_offset": 2048, 00:28:34.309 "data_size": 63488 00:28:34.309 } 00:28:34.309 ] 00:28:34.309 }' 00:28:34.309 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:34.309 17:22:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.875 17:22:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.875 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:28:34.875 "name": "raid_bdev1", 00:28:34.875 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:34.875 "strip_size_kb": 0, 00:28:34.875 "state": "online", 00:28:34.875 "raid_level": "raid1", 00:28:34.875 "superblock": true, 00:28:34.875 "num_base_bdevs": 2, 00:28:34.875 "num_base_bdevs_discovered": 1, 00:28:34.875 "num_base_bdevs_operational": 1, 00:28:34.875 "base_bdevs_list": [ 00:28:34.875 { 00:28:34.875 "name": null, 00:28:34.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:34.875 "is_configured": false, 00:28:34.875 "data_offset": 2048, 00:28:34.875 "data_size": 63488 00:28:34.875 }, 00:28:34.875 { 00:28:34.875 "name": "BaseBdev2", 00:28:34.875 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:34.875 "is_configured": true, 00:28:34.875 "data_offset": 2048, 00:28:34.875 "data_size": 63488 00:28:34.875 } 00:28:34.875 ] 00:28:34.875 }' 00:28:34.875 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:34.875 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:34.875 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:35.133 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:35.133 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:35.392 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:35.650 [2024-07-23 17:22:30.854550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:35.650 [2024-07-23 17:22:30.854598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:35.650 
[2024-07-23 17:22:30.854618] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23714b0 00:28:35.650 [2024-07-23 17:22:30.854630] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:35.650 [2024-07-23 17:22:30.854957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:35.650 [2024-07-23 17:22:30.854974] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:35.650 [2024-07-23 17:22:30.855033] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:35.650 [2024-07-23 17:22:30.855045] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:35.650 [2024-07-23 17:22:30.855061] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:35.650 BaseBdev1 00:28:35.650 17:22:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.587 17:22:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.846 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.846 "name": "raid_bdev1", 00:28:36.846 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:36.846 "strip_size_kb": 0, 00:28:36.846 "state": "online", 00:28:36.846 "raid_level": "raid1", 00:28:36.846 "superblock": true, 00:28:36.846 "num_base_bdevs": 2, 00:28:36.846 "num_base_bdevs_discovered": 1, 00:28:36.846 "num_base_bdevs_operational": 1, 00:28:36.846 "base_bdevs_list": [ 00:28:36.846 { 00:28:36.846 "name": null, 00:28:36.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:36.846 "is_configured": false, 00:28:36.846 "data_offset": 2048, 00:28:36.846 "data_size": 63488 00:28:36.846 }, 00:28:36.846 { 00:28:36.846 "name": "BaseBdev2", 00:28:36.846 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:36.846 "is_configured": true, 00:28:36.846 "data_offset": 2048, 00:28:36.846 "data_size": 63488 00:28:36.846 } 00:28:36.846 ] 00:28:36.846 }' 00:28:36.846 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.846 17:22:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.414 17:22:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:37.673 17:22:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:37.673 "name": "raid_bdev1", 00:28:37.673 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:37.673 "strip_size_kb": 0, 00:28:37.673 "state": "online", 00:28:37.673 "raid_level": "raid1", 00:28:37.673 "superblock": true, 00:28:37.673 "num_base_bdevs": 2, 00:28:37.673 "num_base_bdevs_discovered": 1, 00:28:37.673 "num_base_bdevs_operational": 1, 00:28:37.673 "base_bdevs_list": [ 00:28:37.673 { 00:28:37.673 "name": null, 00:28:37.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:37.673 "is_configured": false, 00:28:37.673 "data_offset": 2048, 00:28:37.673 "data_size": 63488 00:28:37.673 }, 00:28:37.673 { 00:28:37.673 "name": "BaseBdev2", 00:28:37.673 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:37.673 "is_configured": true, 00:28:37.673 "data_offset": 2048, 00:28:37.673 "data_size": 63488 00:28:37.673 } 00:28:37.673 ] 00:28:37.673 }' 00:28:37.673 17:22:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:37.673 17:22:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:37.673 17:22:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:37.932 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:38.189 [2024-07-23 17:22:33.609882] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:28:38.189 [2024-07-23 17:22:33.610002] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:38.189 [2024-07-23 17:22:33.610017] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:38.447 request: 00:28:38.447 { 00:28:38.447 "base_bdev": "BaseBdev1", 00:28:38.447 "raid_bdev": "raid_bdev1", 00:28:38.447 "method": "bdev_raid_add_base_bdev", 00:28:38.447 "req_id": 1 00:28:38.447 } 00:28:38.447 Got JSON-RPC error response 00:28:38.447 response: 00:28:38.447 { 00:28:38.447 "code": -22, 00:28:38.447 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:38.447 } 00:28:38.447 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:28:38.447 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:38.447 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:38.447 17:22:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:38.447 17:22:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.443 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:39.702 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:39.702 "name": "raid_bdev1", 00:28:39.702 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:39.702 "strip_size_kb": 0, 00:28:39.702 "state": "online", 00:28:39.702 "raid_level": "raid1", 00:28:39.702 "superblock": true, 00:28:39.702 "num_base_bdevs": 2, 00:28:39.702 "num_base_bdevs_discovered": 1, 00:28:39.702 "num_base_bdevs_operational": 1, 00:28:39.702 "base_bdevs_list": [ 00:28:39.702 { 00:28:39.702 "name": null, 00:28:39.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:39.702 "is_configured": false, 00:28:39.702 "data_offset": 2048, 00:28:39.702 "data_size": 63488 00:28:39.702 }, 00:28:39.702 { 00:28:39.702 "name": "BaseBdev2", 00:28:39.702 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:39.702 "is_configured": true, 00:28:39.702 "data_offset": 2048, 00:28:39.702 "data_size": 63488 00:28:39.702 } 00:28:39.702 ] 00:28:39.702 }' 00:28:39.702 17:22:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:39.702 17:22:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:40.270 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:40.270 
17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:40.270 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:40.270 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:40.270 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:40.270 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.270 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.530 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:40.530 "name": "raid_bdev1", 00:28:40.530 "uuid": "70b32f29-4a99-4b38-a510-f87089aafc80", 00:28:40.530 "strip_size_kb": 0, 00:28:40.530 "state": "online", 00:28:40.530 "raid_level": "raid1", 00:28:40.530 "superblock": true, 00:28:40.530 "num_base_bdevs": 2, 00:28:40.530 "num_base_bdevs_discovered": 1, 00:28:40.530 "num_base_bdevs_operational": 1, 00:28:40.530 "base_bdevs_list": [ 00:28:40.530 { 00:28:40.530 "name": null, 00:28:40.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:40.530 "is_configured": false, 00:28:40.530 "data_offset": 2048, 00:28:40.530 "data_size": 63488 00:28:40.531 }, 00:28:40.531 { 00:28:40.531 "name": "BaseBdev2", 00:28:40.531 "uuid": "203a6d4d-9d4b-579d-ba3f-6bf4f812c852", 00:28:40.531 "is_configured": true, 00:28:40.531 "data_offset": 2048, 00:28:40.531 "data_size": 63488 00:28:40.531 } 00:28:40.531 ] 00:28:40.531 }' 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 29611 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 29611 ']' 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 29611 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 29611 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 29611' 00:28:40.531 killing process with pid 29611 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 29611 00:28:40.531 Received shutdown signal, test time was about 60.000000 seconds 00:28:40.531 00:28:40.531 Latency(us) 00:28:40.531 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:40.531 =================================================================================================================== 00:28:40.531 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:40.531 [2024-07-23 17:22:35.887848] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:40.531 [2024-07-23 17:22:35.887949] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:40.531 [2024-07-23 17:22:35.887994] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:40.531 [2024-07-23 17:22:35.888005] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21bdce0 name raid_bdev1, state offline 00:28:40.531 17:22:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 29611 00:28:40.531 [2024-07-23 17:22:35.918741] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:40.791 17:22:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:28:40.791 00:28:40.791 real 0m37.659s 00:28:40.791 user 0m54.470s 00:28:40.791 sys 0m7.163s 00:28:40.791 17:22:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:40.791 17:22:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:28:40.791 ************************************ 00:28:40.791 END TEST raid_rebuild_test_sb 00:28:40.791 ************************************ 00:28:40.791 17:22:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:40.791 17:22:36 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:28:40.791 17:22:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:40.791 17:22:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:40.791 17:22:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:41.051 ************************************ 00:28:41.051 START TEST raid_rebuild_test_io 00:28:41.051 ************************************ 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@570 -- # local superblock=false 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # 
strip_size=0 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=34943 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 34943 /var/tmp/spdk-raid.sock 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 34943 ']' 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:41.051 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:41.051 17:22:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:28:41.051 [2024-07-23 17:22:36.294754] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:28:41.051 [2024-07-23 17:22:36.294824] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid34943 ] 00:28:41.051 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:41.051 Zero copy mechanism will not be used. 
00:28:41.051 [2024-07-23 17:22:36.425572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:41.311 [2024-07-23 17:22:36.476170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:41.311 [2024-07-23 17:22:36.533654] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:41.311 [2024-07-23 17:22:36.533690] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:41.879 17:22:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:41.879 17:22:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:28:41.879 17:22:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:41.879 17:22:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:42.138 BaseBdev1_malloc 00:28:42.138 17:22:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:42.397 [2024-07-23 17:22:37.710191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:42.397 [2024-07-23 17:22:37.710243] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.397 [2024-07-23 17:22:37.710263] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1652170 00:28:42.397 [2024-07-23 17:22:37.710276] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.397 [2024-07-23 17:22:37.711860] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.397 [2024-07-23 17:22:37.711887] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:42.397 BaseBdev1 
00:28:42.397 17:22:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:42.397 17:22:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:28:42.656 BaseBdev2_malloc 00:28:42.656 17:22:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:42.915 [2024-07-23 17:22:38.220460] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:42.915 [2024-07-23 17:22:38.220509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.915 [2024-07-23 17:22:38.220529] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1538680 00:28:42.915 [2024-07-23 17:22:38.220541] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.915 [2024-07-23 17:22:38.221955] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.915 [2024-07-23 17:22:38.221983] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:42.915 BaseBdev2 00:28:42.915 17:22:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:28:43.174 spare_malloc 00:28:43.174 17:22:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:43.434 spare_delay 00:28:43.434 17:22:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:43.693 [2024-07-23 17:22:38.882850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:43.693 [2024-07-23 17:22:38.882902] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:43.694 [2024-07-23 17:22:38.882921] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153c2a0 00:28:43.694 [2024-07-23 17:22:38.882934] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:43.694 [2024-07-23 17:22:38.884367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:43.694 [2024-07-23 17:22:38.884395] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:43.694 spare 00:28:43.694 17:22:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:43.952 [2024-07-23 17:22:39.143600] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:43.952 [2024-07-23 17:22:39.144810] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:43.952 [2024-07-23 17:22:39.144901] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x153bea0 00:28:43.952 [2024-07-23 17:22:39.144913] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:28:43.952 [2024-07-23 17:22:39.145125] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x153f2d0 00:28:43.952 [2024-07-23 17:22:39.145265] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x153bea0 00:28:43.952 [2024-07-23 17:22:39.145275] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x153bea0 00:28:43.952 [2024-07-23 17:22:39.145385] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.952 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.211 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:44.211 "name": "raid_bdev1", 00:28:44.211 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:44.211 "strip_size_kb": 0, 00:28:44.211 "state": "online", 00:28:44.211 "raid_level": "raid1", 00:28:44.211 "superblock": false, 00:28:44.211 "num_base_bdevs": 2, 00:28:44.211 "num_base_bdevs_discovered": 2, 00:28:44.211 
"num_base_bdevs_operational": 2, 00:28:44.211 "base_bdevs_list": [ 00:28:44.211 { 00:28:44.211 "name": "BaseBdev1", 00:28:44.211 "uuid": "cd2dafe5-485e-5ecb-9b9e-3f04c7940294", 00:28:44.211 "is_configured": true, 00:28:44.211 "data_offset": 0, 00:28:44.211 "data_size": 65536 00:28:44.211 }, 00:28:44.211 { 00:28:44.211 "name": "BaseBdev2", 00:28:44.211 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:44.211 "is_configured": true, 00:28:44.211 "data_offset": 0, 00:28:44.211 "data_size": 65536 00:28:44.211 } 00:28:44.211 ] 00:28:44.211 }' 00:28:44.211 17:22:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:44.211 17:22:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:28:45.146 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:45.146 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:45.146 [2024-07-23 17:22:40.523512] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:45.146 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:28:45.146 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.146 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:45.403 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:28:45.403 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:28:45.403 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev1 00:28:45.403 17:22:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:28:45.660 [2024-07-23 17:22:40.902290] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x153dc10 00:28:45.660 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:45.660 Zero copy mechanism will not be used. 00:28:45.660 Running I/O for 60 seconds... 00:28:45.660 [2024-07-23 17:22:41.021095] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:45.660 [2024-07-23 17:22:41.021277] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x153dc10 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.660 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.918 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:45.918 "name": "raid_bdev1", 00:28:45.918 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:45.918 "strip_size_kb": 0, 00:28:45.918 "state": "online", 00:28:45.918 "raid_level": "raid1", 00:28:45.918 "superblock": false, 00:28:45.918 "num_base_bdevs": 2, 00:28:45.918 "num_base_bdevs_discovered": 1, 00:28:45.918 "num_base_bdevs_operational": 1, 00:28:45.918 "base_bdevs_list": [ 00:28:45.918 { 00:28:45.918 "name": null, 00:28:45.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:45.919 "is_configured": false, 00:28:45.919 "data_offset": 0, 00:28:45.919 "data_size": 65536 00:28:45.919 }, 00:28:45.919 { 00:28:45.919 "name": "BaseBdev2", 00:28:45.919 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:45.919 "is_configured": true, 00:28:45.919 "data_offset": 0, 00:28:45.919 "data_size": 65536 00:28:45.919 } 00:28:45.919 ] 00:28:45.919 }' 00:28:45.919 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:45.919 17:22:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:28:46.855 17:22:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:46.855 [2024-07-23 17:22:42.173741] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:46.855 17:22:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:46.855 [2024-07-23 17:22:42.248889] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x164d950 00:28:46.855 [2024-07-23 17:22:42.251227] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:28:47.114 [2024-07-23 17:22:42.378091] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:47.114 [2024-07-23 17:22:42.378479] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:47.373 [2024-07-23 17:22:42.606647] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:47.373 [2024-07-23 17:22:42.606869] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:47.631 [2024-07-23 17:22:42.952907] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:28:47.631 [2024-07-23 17:22:42.953199] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:28:47.890 [2024-07-23 17:22:43.181229] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:28:47.890 [2024-07-23 17:22:43.181404] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.890 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.149 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:48.149 "name": "raid_bdev1", 00:28:48.149 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:48.149 "strip_size_kb": 0, 00:28:48.149 "state": "online", 00:28:48.149 "raid_level": "raid1", 00:28:48.149 "superblock": false, 00:28:48.149 "num_base_bdevs": 2, 00:28:48.149 "num_base_bdevs_discovered": 2, 00:28:48.149 "num_base_bdevs_operational": 2, 00:28:48.149 "process": { 00:28:48.149 "type": "rebuild", 00:28:48.149 "target": "spare", 00:28:48.149 "progress": { 00:28:48.149 "blocks": 12288, 00:28:48.149 "percent": 18 00:28:48.149 } 00:28:48.149 }, 00:28:48.149 "base_bdevs_list": [ 00:28:48.149 { 00:28:48.149 "name": "spare", 00:28:48.149 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:48.149 "is_configured": true, 00:28:48.149 "data_offset": 0, 00:28:48.149 "data_size": 65536 00:28:48.149 }, 00:28:48.149 { 00:28:48.149 "name": "BaseBdev2", 00:28:48.149 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:48.149 "is_configured": true, 00:28:48.149 "data_offset": 0, 00:28:48.149 "data_size": 65536 00:28:48.149 } 00:28:48.149 ] 00:28:48.149 }' 00:28:48.149 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:48.149 [2024-07-23 17:22:43.504702] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:28:48.149 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:48.149 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:48.408 17:22:43 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:48.408 17:22:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:48.408 [2024-07-23 17:22:43.614340] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:28:48.408 [2024-07-23 17:22:43.801159] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:48.667 [2024-07-23 17:22:43.881305] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:28:48.667 [2024-07-23 17:22:43.964285] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:48.667 [2024-07-23 17:22:43.966122] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:48.667 [2024-07-23 17:22:43.966150] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:48.667 [2024-07-23 17:22:43.966161] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:48.667 [2024-07-23 17:22:43.989480] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x153dc10 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=1 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.667 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.927 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.927 "name": "raid_bdev1", 00:28:48.927 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:48.927 "strip_size_kb": 0, 00:28:48.927 "state": "online", 00:28:48.927 "raid_level": "raid1", 00:28:48.927 "superblock": false, 00:28:48.927 "num_base_bdevs": 2, 00:28:48.927 "num_base_bdevs_discovered": 1, 00:28:48.927 "num_base_bdevs_operational": 1, 00:28:48.927 "base_bdevs_list": [ 00:28:48.927 { 00:28:48.927 "name": null, 00:28:48.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.927 "is_configured": false, 00:28:48.927 "data_offset": 0, 00:28:48.927 "data_size": 65536 00:28:48.927 }, 00:28:48.927 { 00:28:48.927 "name": "BaseBdev2", 00:28:48.927 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:48.927 "is_configured": true, 00:28:48.927 "data_offset": 0, 00:28:48.927 "data_size": 65536 00:28:48.927 } 00:28:48.927 ] 00:28:48.927 }' 00:28:48.927 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.927 17:22:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.495 17:22:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:49.754 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:49.754 "name": "raid_bdev1", 00:28:49.754 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:49.754 "strip_size_kb": 0, 00:28:49.754 "state": "online", 00:28:49.754 "raid_level": "raid1", 00:28:49.754 "superblock": false, 00:28:49.754 "num_base_bdevs": 2, 00:28:49.754 "num_base_bdevs_discovered": 1, 00:28:49.754 "num_base_bdevs_operational": 1, 00:28:49.754 "base_bdevs_list": [ 00:28:49.754 { 00:28:49.754 "name": null, 00:28:49.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.754 "is_configured": false, 00:28:49.754 "data_offset": 0, 00:28:49.754 "data_size": 65536 00:28:49.754 }, 00:28:49.754 { 00:28:49.754 "name": "BaseBdev2", 00:28:49.754 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:49.754 "is_configured": true, 00:28:49.754 "data_offset": 0, 00:28:49.754 "data_size": 65536 00:28:49.754 } 00:28:49.754 ] 00:28:49.754 }' 00:28:49.754 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:50.013 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == 
\n\o\n\e ]] 00:28:50.013 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:50.013 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:50.013 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:50.272 [2024-07-23 17:22:45.491407] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:50.272 17:22:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:50.272 [2024-07-23 17:22:45.567815] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x164e1a0 00:28:50.272 [2024-07-23 17:22:45.569310] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:50.272 [2024-07-23 17:22:45.687330] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:50.272 [2024-07-23 17:22:45.687834] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:28:50.531 [2024-07-23 17:22:45.891240] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:50.531 [2024-07-23 17:22:45.891513] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:28:50.790 [2024-07-23 17:22:46.139046] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:28:51.049 [2024-07-23 17:22:46.279796] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:28:51.049 [2024-07-23 17:22:46.280026] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 
10240 offset_begin: 6144 offset_end: 12288 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.308 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.308 [2024-07-23 17:22:46.645313] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:28:51.308 [2024-07-23 17:22:46.645764] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:51.567 "name": "raid_bdev1", 00:28:51.567 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:51.567 "strip_size_kb": 0, 00:28:51.567 "state": "online", 00:28:51.567 "raid_level": "raid1", 00:28:51.567 "superblock": false, 00:28:51.567 "num_base_bdevs": 2, 00:28:51.567 "num_base_bdevs_discovered": 2, 00:28:51.567 "num_base_bdevs_operational": 2, 00:28:51.567 "process": { 00:28:51.567 "type": "rebuild", 00:28:51.567 "target": "spare", 00:28:51.567 "progress": { 00:28:51.567 "blocks": 14336, 00:28:51.567 "percent": 21 00:28:51.567 } 00:28:51.567 }, 00:28:51.567 "base_bdevs_list": [ 00:28:51.567 { 00:28:51.567 "name": "spare", 
00:28:51.567 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:51.567 "is_configured": true, 00:28:51.567 "data_offset": 0, 00:28:51.567 "data_size": 65536 00:28:51.567 }, 00:28:51.567 { 00:28:51.567 "name": "BaseBdev2", 00:28:51.567 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:51.567 "is_configured": true, 00:28:51.567 "data_offset": 0, 00:28:51.567 "data_size": 65536 00:28:51.567 } 00:28:51.567 ] 00:28:51.567 }' 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:51.567 [2024-07-23 17:22:46.865903] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=866 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.567 17:22:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.826 17:22:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:51.826 "name": "raid_bdev1", 00:28:51.826 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:51.826 "strip_size_kb": 0, 00:28:51.826 "state": "online", 00:28:51.826 "raid_level": "raid1", 00:28:51.826 "superblock": false, 00:28:51.826 "num_base_bdevs": 2, 00:28:51.826 "num_base_bdevs_discovered": 2, 00:28:51.826 "num_base_bdevs_operational": 2, 00:28:51.826 "process": { 00:28:51.826 "type": "rebuild", 00:28:51.826 "target": "spare", 00:28:51.826 "progress": { 00:28:51.826 "blocks": 18432, 00:28:51.826 "percent": 28 00:28:51.826 } 00:28:51.826 }, 00:28:51.826 "base_bdevs_list": [ 00:28:51.826 { 00:28:51.826 "name": "spare", 00:28:51.826 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:51.826 "is_configured": true, 00:28:51.826 "data_offset": 0, 00:28:51.826 "data_size": 65536 00:28:51.826 }, 00:28:51.826 { 00:28:51.826 "name": "BaseBdev2", 00:28:51.826 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:51.826 "is_configured": true, 00:28:51.826 "data_offset": 0, 00:28:51.826 "data_size": 65536 00:28:51.826 } 00:28:51.826 ] 00:28:51.826 }' 00:28:51.826 17:22:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:51.826 17:22:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:51.826 
17:22:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:52.085 17:22:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:52.085 17:22:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:52.344 [2024-07-23 17:22:47.660398] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:28:52.344 [2024-07-23 17:22:47.660644] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:28:52.912 [2024-07-23 17:22:48.116136] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.912 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.171 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:53.171 "name": "raid_bdev1", 00:28:53.171 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 
00:28:53.171 "strip_size_kb": 0, 00:28:53.171 "state": "online", 00:28:53.171 "raid_level": "raid1", 00:28:53.171 "superblock": false, 00:28:53.171 "num_base_bdevs": 2, 00:28:53.171 "num_base_bdevs_discovered": 2, 00:28:53.171 "num_base_bdevs_operational": 2, 00:28:53.171 "process": { 00:28:53.171 "type": "rebuild", 00:28:53.171 "target": "spare", 00:28:53.171 "progress": { 00:28:53.171 "blocks": 38912, 00:28:53.171 "percent": 59 00:28:53.171 } 00:28:53.171 }, 00:28:53.171 "base_bdevs_list": [ 00:28:53.171 { 00:28:53.171 "name": "spare", 00:28:53.171 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:53.171 "is_configured": true, 00:28:53.171 "data_offset": 0, 00:28:53.171 "data_size": 65536 00:28:53.171 }, 00:28:53.171 { 00:28:53.171 "name": "BaseBdev2", 00:28:53.171 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:53.171 "is_configured": true, 00:28:53.171 "data_offset": 0, 00:28:53.171 "data_size": 65536 00:28:53.171 } 00:28:53.171 ] 00:28:53.171 }' 00:28:53.171 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:53.171 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:53.171 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:53.430 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:53.430 17:22:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:53.698 [2024-07-23 17:22:48.901155] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.276 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.536 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:54.536 "name": "raid_bdev1", 00:28:54.536 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:54.536 "strip_size_kb": 0, 00:28:54.536 "state": "online", 00:28:54.536 "raid_level": "raid1", 00:28:54.536 "superblock": false, 00:28:54.536 "num_base_bdevs": 2, 00:28:54.536 "num_base_bdevs_discovered": 2, 00:28:54.536 "num_base_bdevs_operational": 2, 00:28:54.536 "process": { 00:28:54.536 "type": "rebuild", 00:28:54.536 "target": "spare", 00:28:54.536 "progress": { 00:28:54.536 "blocks": 61440, 00:28:54.536 "percent": 93 00:28:54.536 } 00:28:54.536 }, 00:28:54.536 "base_bdevs_list": [ 00:28:54.536 { 00:28:54.536 "name": "spare", 00:28:54.536 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:54.536 "is_configured": true, 00:28:54.536 "data_offset": 0, 00:28:54.536 "data_size": 65536 00:28:54.536 }, 00:28:54.536 { 00:28:54.536 "name": "BaseBdev2", 00:28:54.536 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:54.536 "is_configured": true, 00:28:54.536 "data_offset": 0, 00:28:54.536 "data_size": 65536 00:28:54.536 } 00:28:54.536 ] 00:28:54.536 }' 00:28:54.536 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:54.536 
17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:54.536 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:54.794 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:54.794 17:22:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:54.795 [2024-07-23 17:22:50.025430] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:54.795 [2024-07-23 17:22:50.125735] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:54.795 [2024-07-23 17:22:50.127619] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.730 17:22:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:55.988 "name": "raid_bdev1", 00:28:55.988 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 
00:28:55.988 "strip_size_kb": 0, 00:28:55.988 "state": "online", 00:28:55.988 "raid_level": "raid1", 00:28:55.988 "superblock": false, 00:28:55.988 "num_base_bdevs": 2, 00:28:55.988 "num_base_bdevs_discovered": 2, 00:28:55.988 "num_base_bdevs_operational": 2, 00:28:55.988 "base_bdevs_list": [ 00:28:55.988 { 00:28:55.988 "name": "spare", 00:28:55.988 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:55.988 "is_configured": true, 00:28:55.988 "data_offset": 0, 00:28:55.988 "data_size": 65536 00:28:55.988 }, 00:28:55.988 { 00:28:55.988 "name": "BaseBdev2", 00:28:55.988 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:55.988 "is_configured": true, 00:28:55.988 "data_offset": 0, 00:28:55.988 "data_size": 65536 00:28:55.988 } 00:28:55.988 ] 00:28:55.988 }' 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.988 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:56.247 "name": "raid_bdev1", 00:28:56.247 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:56.247 "strip_size_kb": 0, 00:28:56.247 "state": "online", 00:28:56.247 "raid_level": "raid1", 00:28:56.247 "superblock": false, 00:28:56.247 "num_base_bdevs": 2, 00:28:56.247 "num_base_bdevs_discovered": 2, 00:28:56.247 "num_base_bdevs_operational": 2, 00:28:56.247 "base_bdevs_list": [ 00:28:56.247 { 00:28:56.247 "name": "spare", 00:28:56.247 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:56.247 "is_configured": true, 00:28:56.247 "data_offset": 0, 00:28:56.247 "data_size": 65536 00:28:56.247 }, 00:28:56.247 { 00:28:56.247 "name": "BaseBdev2", 00:28:56.247 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:56.247 "is_configured": true, 00:28:56.247 "data_offset": 0, 00:28:56.247 "data_size": 65536 00:28:56.247 } 00:28:56.247 ] 00:28:56.247 }' 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.247 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.248 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.248 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.248 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.507 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.507 "name": "raid_bdev1", 00:28:56.507 "uuid": "d81c8073-3af2-457e-8baf-456426cdf725", 00:28:56.507 "strip_size_kb": 0, 00:28:56.507 "state": "online", 00:28:56.507 "raid_level": "raid1", 00:28:56.507 "superblock": false, 00:28:56.507 "num_base_bdevs": 2, 00:28:56.507 "num_base_bdevs_discovered": 2, 00:28:56.507 "num_base_bdevs_operational": 2, 00:28:56.507 "base_bdevs_list": [ 00:28:56.507 { 00:28:56.507 "name": "spare", 00:28:56.507 "uuid": "c7f73da9-bd21-5d44-9c6e-c2ba24bb08ee", 00:28:56.507 "is_configured": true, 00:28:56.507 "data_offset": 0, 00:28:56.507 "data_size": 65536 00:28:56.507 }, 00:28:56.507 { 00:28:56.507 "name": "BaseBdev2", 00:28:56.507 "uuid": "0b68fa03-71c6-568e-8c5d-368354e9ce48", 00:28:56.507 "is_configured": true, 00:28:56.507 "data_offset": 0, 00:28:56.507 
"data_size": 65536 00:28:56.507 } 00:28:56.507 ] 00:28:56.507 }' 00:28:56.507 17:22:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.507 17:22:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:28:57.075 17:22:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:57.334 [2024-07-23 17:22:52.719807] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:57.334 [2024-07-23 17:22:52.719839] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:57.593 00:28:57.593 Latency(us) 00:28:57.593 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:57.593 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:28:57.593 raid_bdev1 : 11.83 94.67 284.02 0.00 0.00 14691.11 292.06 117622.87 00:28:57.593 =================================================================================================================== 00:28:57.593 Total : 94.67 284.02 0.00 0.00 14691.11 292.06 117622.87 00:28:57.593 [2024-07-23 17:22:52.767825] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:57.593 [2024-07-23 17:22:52.767853] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:57.593 [2024-07-23 17:22:52.767937] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:57.593 [2024-07-23 17:22:52.767950] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x153bea0 name raid_bdev1, state offline 00:28:57.593 0 00:28:57.593 17:22:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.593 17:22:52 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:57.853 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:28:58.112 /dev/nbd0 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:28:58.112 17:22:53 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:58.112 1+0 records in 00:28:58.112 1+0 records out 00:28:58.112 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299392 s, 13.7 MB/s 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 
']' 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:58.112 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:28:58.375 /dev/nbd1 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@871 -- # break 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:58.375 1+0 records in 00:28:58.375 1+0 records out 00:28:58.375 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274739 s, 14.9 MB/s 00:28:58.375 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:58.376 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:58.634 17:22:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 34943 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 34943 ']' 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 34943 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 34943 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing 
process with pid 34943' 00:28:58.893 killing process with pid 34943 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 34943 00:28:58.893 Received shutdown signal, test time was about 13.370319 seconds 00:28:58.893 00:28:58.893 Latency(us) 00:28:58.893 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:58.893 =================================================================================================================== 00:28:58.893 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:58.893 [2024-07-23 17:22:54.307582] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:58.893 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 34943 00:28:59.152 [2024-07-23 17:22:54.329048] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:59.152 17:22:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:28:59.152 00:28:59.152 real 0m18.319s 00:28:59.152 user 0m27.966s 00:28:59.152 sys 0m2.964s 00:28:59.152 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:59.152 17:22:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:28:59.152 ************************************ 00:28:59.152 END TEST raid_rebuild_test_io 00:28:59.152 ************************************ 00:28:59.411 17:22:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:59.411 17:22:54 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:28:59.411 17:22:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:59.411 17:22:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:59.411 17:22:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:59.412 ************************************ 00:28:59.412 START TEST raid_rebuild_test_sb_io 00:28:59.412 
************************************ 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 
00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=37461 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 37461 /var/tmp/spdk-raid.sock 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 37461 ']' 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:59.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:59.412 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:59.412 [2024-07-23 17:22:54.701129] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:28:59.412 [2024-07-23 17:22:54.701183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid37461 ] 00:28:59.412 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:59.412 Zero copy mechanism will not be used. 00:28:59.412 [2024-07-23 17:22:54.815119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:59.671 [2024-07-23 17:22:54.868363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:59.671 [2024-07-23 17:22:54.927951] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:59.671 [2024-07-23 17:22:54.927990] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:59.671 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:59.671 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:28:59.671 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:59.671 17:22:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:28:59.930 BaseBdev1_malloc 00:28:59.930 17:22:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p 
BaseBdev1 00:29:00.188 [2024-07-23 17:22:55.467789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:00.188 [2024-07-23 17:22:55.467838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:00.188 [2024-07-23 17:22:55.467859] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xce8170 00:29:00.188 [2024-07-23 17:22:55.467872] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:00.188 [2024-07-23 17:22:55.469336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:00.188 [2024-07-23 17:22:55.469365] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:00.188 BaseBdev1 00:29:00.188 17:22:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:00.188 17:22:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:29:00.447 BaseBdev2_malloc 00:29:00.447 17:22:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:00.706 [2024-07-23 17:22:55.965766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:00.706 [2024-07-23 17:22:55.965810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:00.706 [2024-07-23 17:22:55.965828] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbce680 00:29:00.707 [2024-07-23 17:22:55.965840] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:00.707 [2024-07-23 17:22:55.967196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:00.707 [2024-07-23 
17:22:55.967223] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:00.707 BaseBdev2 00:29:00.707 17:22:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:29:00.966 spare_malloc 00:29:00.966 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:01.226 spare_delay 00:29:01.226 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:01.486 [2024-07-23 17:22:56.708126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:01.486 [2024-07-23 17:22:56.708171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:01.486 [2024-07-23 17:22:56.708190] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd22a0 00:29:01.486 [2024-07-23 17:22:56.708203] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:01.486 [2024-07-23 17:22:56.709677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:01.486 [2024-07-23 17:22:56.709704] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:01.486 spare 00:29:01.486 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:01.745 [2024-07-23 17:22:56.948790] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:01.745 
[2024-07-23 17:22:56.949922] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:01.745 [2024-07-23 17:22:56.950072] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xbd1ea0 00:29:01.745 [2024-07-23 17:22:56.950086] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:01.745 [2024-07-23 17:22:56.950267] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbd4010 00:29:01.745 [2024-07-23 17:22:56.950400] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbd1ea0 00:29:01.745 [2024-07-23 17:22:56.950410] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbd1ea0 00:29:01.745 [2024-07-23 17:22:56.950500] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.745 17:22:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:02.005 17:22:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:02.005 "name": "raid_bdev1", 00:29:02.005 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:02.005 "strip_size_kb": 0, 00:29:02.005 "state": "online", 00:29:02.005 "raid_level": "raid1", 00:29:02.005 "superblock": true, 00:29:02.005 "num_base_bdevs": 2, 00:29:02.005 "num_base_bdevs_discovered": 2, 00:29:02.005 "num_base_bdevs_operational": 2, 00:29:02.005 "base_bdevs_list": [ 00:29:02.005 { 00:29:02.005 "name": "BaseBdev1", 00:29:02.005 "uuid": "ffb3488e-5fae-5e13-ae4d-29ca9ce36f1d", 00:29:02.005 "is_configured": true, 00:29:02.005 "data_offset": 2048, 00:29:02.005 "data_size": 63488 00:29:02.005 }, 00:29:02.005 { 00:29:02.005 "name": "BaseBdev2", 00:29:02.005 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:02.005 "is_configured": true, 00:29:02.005 "data_offset": 2048, 00:29:02.005 "data_size": 63488 00:29:02.005 } 00:29:02.005 ] 00:29:02.005 }' 00:29:02.005 17:22:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:02.005 17:22:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:02.574 17:22:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:02.574 17:22:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:02.833 [2024-07-23 17:22:58.043927] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:02.833 17:22:58 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:29:02.833 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.833 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:03.093 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:29:03.093 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:29:03.093 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:03.093 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:29:03.093 [2024-07-23 17:22:58.426729] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb377b0 00:29:03.093 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:03.093 Zero copy mechanism will not be used. 00:29:03.093 Running I/O for 60 seconds... 
00:29:03.353 [2024-07-23 17:22:58.534059] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:03.353 [2024-07-23 17:22:58.542232] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb377b0 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.353 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.612 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:03.612 "name": "raid_bdev1", 00:29:03.612 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:03.612 "strip_size_kb": 0, 00:29:03.612 "state": "online", 00:29:03.612 "raid_level": "raid1", 
00:29:03.612 "superblock": true, 00:29:03.612 "num_base_bdevs": 2, 00:29:03.612 "num_base_bdevs_discovered": 1, 00:29:03.612 "num_base_bdevs_operational": 1, 00:29:03.612 "base_bdevs_list": [ 00:29:03.612 { 00:29:03.612 "name": null, 00:29:03.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:03.612 "is_configured": false, 00:29:03.612 "data_offset": 2048, 00:29:03.612 "data_size": 63488 00:29:03.612 }, 00:29:03.612 { 00:29:03.612 "name": "BaseBdev2", 00:29:03.612 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:03.612 "is_configured": true, 00:29:03.612 "data_offset": 2048, 00:29:03.612 "data_size": 63488 00:29:03.612 } 00:29:03.612 ] 00:29:03.612 }' 00:29:03.612 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:03.612 17:22:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:04.181 17:22:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:04.440 [2024-07-23 17:22:59.710154] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:04.440 [2024-07-23 17:22:59.761048] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb48810 00:29:04.440 17:22:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:04.440 [2024-07-23 17:22:59.763383] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:04.700 [2024-07-23 17:22:59.888866] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:29:04.700 [2024-07-23 17:22:59.889319] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:29:04.700 [2024-07-23 17:23:00.118385] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 
4096 offset_begin: 0 offset_end: 6144 00:29:04.700 [2024-07-23 17:23:00.118655] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:29:04.959 [2024-07-23 17:23:00.366358] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:29:05.217 [2024-07-23 17:23:00.468118] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:29:05.217 [2024-07-23 17:23:00.468316] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:29:05.477 [2024-07-23 17:23:00.698932] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:29:05.477 [2024-07-23 17:23:00.699255] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.477 17:23:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:05.736 [2024-07-23 17:23:00.944967] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:29:05.737 [2024-07-23 17:23:00.953213] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:29:05.737 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:05.737 "name": "raid_bdev1", 00:29:05.737 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:05.737 "strip_size_kb": 0, 00:29:05.737 "state": "online", 00:29:05.737 "raid_level": "raid1", 00:29:05.737 "superblock": true, 00:29:05.737 "num_base_bdevs": 2, 00:29:05.737 "num_base_bdevs_discovered": 2, 00:29:05.737 "num_base_bdevs_operational": 2, 00:29:05.737 "process": { 00:29:05.737 "type": "rebuild", 00:29:05.737 "target": "spare", 00:29:05.737 "progress": { 00:29:05.737 "blocks": 16384, 00:29:05.737 "percent": 25 00:29:05.737 } 00:29:05.737 }, 00:29:05.737 "base_bdevs_list": [ 00:29:05.737 { 00:29:05.737 "name": "spare", 00:29:05.737 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:05.737 "is_configured": true, 00:29:05.737 "data_offset": 2048, 00:29:05.737 "data_size": 63488 00:29:05.737 }, 00:29:05.737 { 00:29:05.737 "name": "BaseBdev2", 00:29:05.737 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:05.737 "is_configured": true, 00:29:05.737 "data_offset": 2048, 00:29:05.737 "data_size": 63488 00:29:05.737 } 00:29:05.737 ] 00:29:05.737 }' 00:29:05.737 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:05.737 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:05.737 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:05.737 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:05.737 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:05.996 [2024-07-23 17:23:01.301254] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:29:05.996 [2024-07-23 17:23:01.301724] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:29:05.996 [2024-07-23 17:23:01.338803] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:06.256 [2024-07-23 17:23:01.539178] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:06.256 [2024-07-23 17:23:01.549072] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:06.256 [2024-07-23 17:23:01.549100] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:06.256 [2024-07-23 17:23:01.549110] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:06.256 [2024-07-23 17:23:01.588075] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb377b0 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.256 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.257 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.516 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.516 "name": "raid_bdev1", 00:29:06.516 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:06.516 "strip_size_kb": 0, 00:29:06.516 "state": "online", 00:29:06.516 "raid_level": "raid1", 00:29:06.516 "superblock": true, 00:29:06.516 "num_base_bdevs": 2, 00:29:06.516 "num_base_bdevs_discovered": 1, 00:29:06.516 "num_base_bdevs_operational": 1, 00:29:06.516 "base_bdevs_list": [ 00:29:06.516 { 00:29:06.516 "name": null, 00:29:06.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:06.516 "is_configured": false, 00:29:06.516 "data_offset": 2048, 00:29:06.516 "data_size": 63488 00:29:06.516 }, 00:29:06.516 { 00:29:06.516 "name": "BaseBdev2", 00:29:06.516 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:06.516 "is_configured": true, 00:29:06.516 "data_offset": 2048, 00:29:06.516 "data_size": 63488 00:29:06.516 } 00:29:06.516 ] 00:29:06.516 }' 00:29:06.516 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.516 17:23:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:07.085 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none 
none 00:29:07.085 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:07.085 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:07.085 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:07.085 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:07.345 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.345 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.607 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:07.607 "name": "raid_bdev1", 00:29:07.607 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:07.607 "strip_size_kb": 0, 00:29:07.607 "state": "online", 00:29:07.607 "raid_level": "raid1", 00:29:07.607 "superblock": true, 00:29:07.607 "num_base_bdevs": 2, 00:29:07.607 "num_base_bdevs_discovered": 1, 00:29:07.607 "num_base_bdevs_operational": 1, 00:29:07.607 "base_bdevs_list": [ 00:29:07.607 { 00:29:07.607 "name": null, 00:29:07.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.607 "is_configured": false, 00:29:07.607 "data_offset": 2048, 00:29:07.607 "data_size": 63488 00:29:07.607 }, 00:29:07.607 { 00:29:07.607 "name": "BaseBdev2", 00:29:07.607 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:07.607 "is_configured": true, 00:29:07.607 "data_offset": 2048, 00:29:07.607 "data_size": 63488 00:29:07.607 } 00:29:07.607 ] 00:29:07.607 }' 00:29:07.607 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:07.607 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:07.607 17:23:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:07.607 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:07.607 17:23:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:07.907 [2024-07-23 17:23:03.079982] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:07.907 17:23:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:07.907 [2024-07-23 17:23:03.139841] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb49320 00:29:07.907 [2024-07-23 17:23:03.141330] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:07.907 [2024-07-23 17:23:03.259489] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:29:07.907 [2024-07-23 17:23:03.259907] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:29:08.165 [2024-07-23 17:23:03.463395] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:29:08.165 [2024-07-23 17:23:03.463552] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:29:08.424 [2024-07-23 17:23:03.828376] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:29:08.424 [2024-07-23 17:23:03.828767] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:29:08.683 [2024-07-23 17:23:04.056279] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 
offset_end: 12288 00:29:08.683 [2024-07-23 17:23:04.056560] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.941 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:08.941 [2024-07-23 17:23:04.320244] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:29:08.941 [2024-07-23 17:23:04.320636] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:09.200 "name": "raid_bdev1", 00:29:09.200 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:09.200 "strip_size_kb": 0, 00:29:09.200 "state": "online", 00:29:09.200 "raid_level": "raid1", 00:29:09.200 "superblock": true, 00:29:09.200 "num_base_bdevs": 2, 00:29:09.200 "num_base_bdevs_discovered": 2, 00:29:09.200 "num_base_bdevs_operational": 2, 00:29:09.200 "process": { 00:29:09.200 "type": "rebuild", 00:29:09.200 "target": "spare", 00:29:09.200 "progress": { 
00:29:09.200 "blocks": 14336, 00:29:09.200 "percent": 22 00:29:09.200 } 00:29:09.200 }, 00:29:09.200 "base_bdevs_list": [ 00:29:09.200 { 00:29:09.200 "name": "spare", 00:29:09.200 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:09.200 "is_configured": true, 00:29:09.200 "data_offset": 2048, 00:29:09.200 "data_size": 63488 00:29:09.200 }, 00:29:09.200 { 00:29:09.200 "name": "BaseBdev2", 00:29:09.200 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:09.200 "is_configured": true, 00:29:09.200 "data_offset": 2048, 00:29:09.200 "data_size": 63488 00:29:09.200 } 00:29:09.200 ] 00:29:09.200 }' 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:09.200 [2024-07-23 17:23:04.440077] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:29:09.200 [2024-07-23 17:23:04.440292] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:09.200 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:09.200 17:23:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=884 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.200 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.459 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:09.459 "name": "raid_bdev1", 00:29:09.459 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:09.459 "strip_size_kb": 0, 00:29:09.459 "state": "online", 00:29:09.459 "raid_level": "raid1", 00:29:09.459 "superblock": true, 00:29:09.459 "num_base_bdevs": 2, 00:29:09.459 "num_base_bdevs_discovered": 2, 00:29:09.459 "num_base_bdevs_operational": 2, 00:29:09.459 "process": { 00:29:09.459 "type": "rebuild", 00:29:09.459 "target": "spare", 00:29:09.459 "progress": { 00:29:09.459 "blocks": 20480, 00:29:09.459 "percent": 32 00:29:09.459 } 00:29:09.459 }, 00:29:09.459 "base_bdevs_list": [ 00:29:09.459 { 00:29:09.459 "name": "spare", 00:29:09.459 "uuid": 
"d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:09.459 "is_configured": true, 00:29:09.459 "data_offset": 2048, 00:29:09.459 "data_size": 63488 00:29:09.459 }, 00:29:09.459 { 00:29:09.459 "name": "BaseBdev2", 00:29:09.459 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:09.459 "is_configured": true, 00:29:09.459 "data_offset": 2048, 00:29:09.459 "data_size": 63488 00:29:09.459 } 00:29:09.459 ] 00:29:09.459 }' 00:29:09.459 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:09.459 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:09.459 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:09.459 [2024-07-23 17:23:04.779096] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:29:09.459 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:09.459 17:23:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:10.396 [2024-07-23 17:23:05.592466] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.396 17:23:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.655 17:23:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:10.655 "name": "raid_bdev1", 00:29:10.655 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:10.655 "strip_size_kb": 0, 00:29:10.655 "state": "online", 00:29:10.655 "raid_level": "raid1", 00:29:10.655 "superblock": true, 00:29:10.655 "num_base_bdevs": 2, 00:29:10.655 "num_base_bdevs_discovered": 2, 00:29:10.655 "num_base_bdevs_operational": 2, 00:29:10.655 "process": { 00:29:10.655 "type": "rebuild", 00:29:10.655 "target": "spare", 00:29:10.655 "progress": { 00:29:10.655 "blocks": 40960, 00:29:10.655 "percent": 64 00:29:10.655 } 00:29:10.655 }, 00:29:10.655 "base_bdevs_list": [ 00:29:10.655 { 00:29:10.655 "name": "spare", 00:29:10.655 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:10.655 "is_configured": true, 00:29:10.655 "data_offset": 2048, 00:29:10.655 "data_size": 63488 00:29:10.655 }, 00:29:10.655 { 00:29:10.655 "name": "BaseBdev2", 00:29:10.655 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:10.655 "is_configured": true, 00:29:10.655 "data_offset": 2048, 00:29:10.655 "data_size": 63488 00:29:10.655 } 00:29:10.655 ] 00:29:10.655 }' 00:29:10.655 17:23:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:10.914 17:23:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:10.914 17:23:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:10.914 17:23:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:10.914 17:23:06 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:10.914 [2024-07-23 17:23:06.176752] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:29:11.482 [2024-07-23 17:23:06.746280] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:29:11.740 [2024-07-23 17:23:06.985541] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:29:11.740 [2024-07-23 17:23:07.086926] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:29:11.740 [2024-07-23 17:23:07.087094] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.740 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.999 [2024-07-23 17:23:07.328366] bdev_raid.c:2870:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:29:11.999 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:11.999 "name": "raid_bdev1", 00:29:11.999 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:11.999 "strip_size_kb": 0, 00:29:11.999 "state": "online", 00:29:11.999 "raid_level": "raid1", 00:29:11.999 "superblock": true, 00:29:11.999 "num_base_bdevs": 2, 00:29:11.999 "num_base_bdevs_discovered": 2, 00:29:11.999 "num_base_bdevs_operational": 2, 00:29:11.999 "process": { 00:29:11.999 "type": "rebuild", 00:29:11.999 "target": "spare", 00:29:11.999 "progress": { 00:29:11.999 "blocks": 63488, 00:29:11.999 "percent": 100 00:29:11.999 } 00:29:11.999 }, 00:29:11.999 "base_bdevs_list": [ 00:29:11.999 { 00:29:11.999 "name": "spare", 00:29:11.999 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:11.999 "is_configured": true, 00:29:11.999 "data_offset": 2048, 00:29:11.999 "data_size": 63488 00:29:11.999 }, 00:29:11.999 { 00:29:11.999 "name": "BaseBdev2", 00:29:11.999 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:11.999 "is_configured": true, 00:29:11.999 "data_offset": 2048, 00:29:11.999 "data_size": 63488 00:29:11.999 } 00:29:11.999 ] 00:29:11.999 }' 00:29:11.999 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:12.258 [2024-07-23 17:23:07.436650] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:12.258 [2024-07-23 17:23:07.439106] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:12.258 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:12.258 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:12.258 17:23:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:12.258 17:23:07 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.193 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:13.452 "name": "raid_bdev1", 00:29:13.452 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:13.452 "strip_size_kb": 0, 00:29:13.452 "state": "online", 00:29:13.452 "raid_level": "raid1", 00:29:13.452 "superblock": true, 00:29:13.452 "num_base_bdevs": 2, 00:29:13.452 "num_base_bdevs_discovered": 2, 00:29:13.452 "num_base_bdevs_operational": 2, 00:29:13.452 "base_bdevs_list": [ 00:29:13.452 { 00:29:13.452 "name": "spare", 00:29:13.452 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:13.452 "is_configured": true, 00:29:13.452 "data_offset": 2048, 00:29:13.452 "data_size": 63488 00:29:13.452 }, 00:29:13.452 { 00:29:13.452 "name": "BaseBdev2", 00:29:13.452 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:13.452 "is_configured": true, 00:29:13.452 "data_offset": 2048, 00:29:13.452 "data_size": 63488 00:29:13.452 
} 00:29:13.452 ] 00:29:13.452 }' 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.452 17:23:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.710 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:13.710 "name": "raid_bdev1", 00:29:13.710 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:13.710 "strip_size_kb": 0, 00:29:13.710 "state": "online", 00:29:13.710 "raid_level": "raid1", 00:29:13.710 "superblock": true, 00:29:13.710 "num_base_bdevs": 2, 00:29:13.710 "num_base_bdevs_discovered": 2, 00:29:13.710 "num_base_bdevs_operational": 2, 00:29:13.710 "base_bdevs_list": [ 00:29:13.710 { 
00:29:13.710 "name": "spare", 00:29:13.710 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:13.710 "is_configured": true, 00:29:13.710 "data_offset": 2048, 00:29:13.710 "data_size": 63488 00:29:13.710 }, 00:29:13.710 { 00:29:13.710 "name": "BaseBdev2", 00:29:13.710 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:13.710 "is_configured": true, 00:29:13.710 "data_offset": 2048, 00:29:13.710 "data_size": 63488 00:29:13.710 } 00:29:13.710 ] 00:29:13.710 }' 00:29:13.711 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.970 17:23:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.970 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:14.229 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:14.229 "name": "raid_bdev1", 00:29:14.229 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:14.229 "strip_size_kb": 0, 00:29:14.229 "state": "online", 00:29:14.229 "raid_level": "raid1", 00:29:14.229 "superblock": true, 00:29:14.229 "num_base_bdevs": 2, 00:29:14.229 "num_base_bdevs_discovered": 2, 00:29:14.229 "num_base_bdevs_operational": 2, 00:29:14.229 "base_bdevs_list": [ 00:29:14.229 { 00:29:14.229 "name": "spare", 00:29:14.229 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:14.229 "is_configured": true, 00:29:14.229 "data_offset": 2048, 00:29:14.229 "data_size": 63488 00:29:14.229 }, 00:29:14.229 { 00:29:14.229 "name": "BaseBdev2", 00:29:14.229 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:14.229 "is_configured": true, 00:29:14.229 "data_offset": 2048, 00:29:14.229 "data_size": 63488 00:29:14.229 } 00:29:14.229 ] 00:29:14.229 }' 00:29:14.229 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:14.229 17:23:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:14.796 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:15.054 [2024-07-23 17:23:10.273122] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:15.054 [2024-07-23 17:23:10.273157] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev 
state changing from online to offline 00:29:15.054 00:29:15.055 Latency(us) 00:29:15.055 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:15.055 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:29:15.055 raid_bdev1 : 11.84 88.44 265.31 0.00 0.00 15070.53 281.38 118534.68 00:29:15.055 =================================================================================================================== 00:29:15.055 Total : 88.44 265.31 0.00 0.00 15070.53 281.38 118534.68 00:29:15.055 [2024-07-23 17:23:10.301161] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:15.055 [2024-07-23 17:23:10.301189] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:15.055 [2024-07-23 17:23:10.301262] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:15.055 [2024-07-23 17:23:10.301274] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd1ea0 name raid_bdev1, state offline 00:29:15.055 0 00:29:15.055 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.055 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:15.313 17:23:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:15.313 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:29:15.314 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:15.314 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:15.314 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:29:15.572 /dev/nbd0 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:15.572 1+0 records in 00:29:15.572 1+0 records out 00:29:15.572 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284138 s, 14.4 MB/s 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:15.572 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd1') 00:29:15.573 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:15.573 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:29:15.573 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:15.573 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:15.573 17:23:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:29:15.831 /dev/nbd1 00:29:15.831 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:15.832 1+0 records in 00:29:15.832 1+0 records out 
00:29:15.832 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000198333 s, 20.7 MB/s 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:15.832 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:16.091 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:16.349 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:16.349 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:16.608 17:23:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:16.608 17:23:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:16.608 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:16.867 [2024-07-23 17:23:12.250970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:16.867 [2024-07-23 17:23:12.251015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:16.867 [2024-07-23 17:23:12.251034] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd3310 00:29:16.867 [2024-07-23 17:23:12.251046] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:16.867 [2024-07-23 17:23:12.252695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:16.867 [2024-07-23 17:23:12.252724] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:16.867 [2024-07-23 17:23:12.252807] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:16.867 [2024-07-23 17:23:12.252835] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:16.867 
[2024-07-23 17:23:12.252953] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:16.867 spare 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.867 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.125 [2024-07-23 17:23:12.353270] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xbd4a80 00:29:17.125 [2024-07-23 17:23:12.353287] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:17.125 [2024-07-23 17:23:12.353473] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbd3dd0 00:29:17.125 [2024-07-23 17:23:12.353616] 
bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbd4a80 00:29:17.125 [2024-07-23 17:23:12.353626] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbd4a80 00:29:17.125 [2024-07-23 17:23:12.353734] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:17.125 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.125 "name": "raid_bdev1", 00:29:17.125 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:17.125 "strip_size_kb": 0, 00:29:17.125 "state": "online", 00:29:17.125 "raid_level": "raid1", 00:29:17.125 "superblock": true, 00:29:17.125 "num_base_bdevs": 2, 00:29:17.126 "num_base_bdevs_discovered": 2, 00:29:17.126 "num_base_bdevs_operational": 2, 00:29:17.126 "base_bdevs_list": [ 00:29:17.126 { 00:29:17.126 "name": "spare", 00:29:17.126 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:17.126 "is_configured": true, 00:29:17.126 "data_offset": 2048, 00:29:17.126 "data_size": 63488 00:29:17.126 }, 00:29:17.126 { 00:29:17.126 "name": "BaseBdev2", 00:29:17.126 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:17.126 "is_configured": true, 00:29:17.126 "data_offset": 2048, 00:29:17.126 "data_size": 63488 00:29:17.126 } 00:29:17.126 ] 00:29:17.126 }' 00:29:17.126 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.126 17:23:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:18.062 
17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:18.062 "name": "raid_bdev1", 00:29:18.062 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:18.062 "strip_size_kb": 0, 00:29:18.062 "state": "online", 00:29:18.062 "raid_level": "raid1", 00:29:18.062 "superblock": true, 00:29:18.062 "num_base_bdevs": 2, 00:29:18.062 "num_base_bdevs_discovered": 2, 00:29:18.062 "num_base_bdevs_operational": 2, 00:29:18.062 "base_bdevs_list": [ 00:29:18.062 { 00:29:18.062 "name": "spare", 00:29:18.062 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:18.062 "is_configured": true, 00:29:18.062 "data_offset": 2048, 00:29:18.062 "data_size": 63488 00:29:18.062 }, 00:29:18.062 { 00:29:18.062 "name": "BaseBdev2", 00:29:18.062 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:18.062 "is_configured": true, 00:29:18.062 "data_offset": 2048, 00:29:18.062 "data_size": 63488 00:29:18.062 } 00:29:18.062 ] 00:29:18.062 }' 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:18.062 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:18.320 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:18.320 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:18.320 17:23:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.578 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:18.578 17:23:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:18.578 [2024-07-23 17:23:13.995947] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.838 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.097 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:19.097 "name": "raid_bdev1", 00:29:19.097 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:19.097 "strip_size_kb": 0, 00:29:19.097 "state": "online", 00:29:19.097 "raid_level": "raid1", 00:29:19.097 "superblock": true, 00:29:19.097 "num_base_bdevs": 2, 00:29:19.097 "num_base_bdevs_discovered": 1, 00:29:19.097 "num_base_bdevs_operational": 1, 00:29:19.097 "base_bdevs_list": [ 00:29:19.097 { 00:29:19.097 "name": null, 00:29:19.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.097 "is_configured": false, 00:29:19.097 "data_offset": 2048, 00:29:19.097 "data_size": 63488 00:29:19.097 }, 00:29:19.097 { 00:29:19.097 "name": "BaseBdev2", 00:29:19.097 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:19.097 "is_configured": true, 00:29:19.097 "data_offset": 2048, 00:29:19.097 "data_size": 63488 00:29:19.097 } 00:29:19.097 ] 00:29:19.097 }' 00:29:19.097 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:19.097 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:19.663 17:23:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:19.922 [2024-07-23 17:23:15.098995] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:19.922 [2024-07-23 17:23:15.099141] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:19.922 [2024-07-23 17:23:15.099157] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:19.922 [2024-07-23 17:23:15.099185] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:19.922 [2024-07-23 17:23:15.104383] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbd3dd0 00:29:19.922 [2024-07-23 17:23:15.106492] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:19.922 17:23:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.857 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.116 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:21.116 "name": "raid_bdev1", 00:29:21.116 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:21.116 "strip_size_kb": 0, 00:29:21.116 "state": "online", 00:29:21.116 "raid_level": "raid1", 00:29:21.116 "superblock": true, 00:29:21.116 "num_base_bdevs": 2, 00:29:21.116 "num_base_bdevs_discovered": 2, 00:29:21.116 "num_base_bdevs_operational": 2, 00:29:21.116 "process": { 00:29:21.116 "type": "rebuild", 00:29:21.116 "target": "spare", 00:29:21.116 "progress": { 00:29:21.116 "blocks": 24576, 
00:29:21.116 "percent": 38 00:29:21.116 } 00:29:21.116 }, 00:29:21.116 "base_bdevs_list": [ 00:29:21.116 { 00:29:21.116 "name": "spare", 00:29:21.116 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:21.116 "is_configured": true, 00:29:21.116 "data_offset": 2048, 00:29:21.116 "data_size": 63488 00:29:21.116 }, 00:29:21.116 { 00:29:21.116 "name": "BaseBdev2", 00:29:21.116 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:21.116 "is_configured": true, 00:29:21.116 "data_offset": 2048, 00:29:21.116 "data_size": 63488 00:29:21.116 } 00:29:21.116 ] 00:29:21.116 }' 00:29:21.116 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:21.116 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:21.116 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:21.116 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:21.116 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:21.375 [2024-07-23 17:23:16.718929] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:21.375 [2024-07-23 17:23:16.719352] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:21.375 [2024-07-23 17:23:16.719398] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:21.375 [2024-07-23 17:23:16.719414] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:21.375 [2024-07-23 17:23:16.719422] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.375 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.634 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:21.634 "name": "raid_bdev1", 00:29:21.634 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:21.634 "strip_size_kb": 0, 00:29:21.634 "state": "online", 00:29:21.634 "raid_level": "raid1", 00:29:21.634 "superblock": true, 00:29:21.634 "num_base_bdevs": 2, 00:29:21.634 "num_base_bdevs_discovered": 1, 00:29:21.634 "num_base_bdevs_operational": 1, 00:29:21.634 "base_bdevs_list": [ 00:29:21.634 { 00:29:21.634 "name": null, 00:29:21.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:21.634 "is_configured": false, 00:29:21.634 
"data_offset": 2048, 00:29:21.634 "data_size": 63488 00:29:21.634 }, 00:29:21.634 { 00:29:21.634 "name": "BaseBdev2", 00:29:21.634 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:21.634 "is_configured": true, 00:29:21.634 "data_offset": 2048, 00:29:21.634 "data_size": 63488 00:29:21.634 } 00:29:21.634 ] 00:29:21.634 }' 00:29:21.634 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:21.634 17:23:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:22.201 17:23:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:22.497 [2024-07-23 17:23:17.799807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:22.497 [2024-07-23 17:23:17.799863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:22.497 [2024-07-23 17:23:17.799884] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb35630 00:29:22.497 [2024-07-23 17:23:17.799902] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:22.497 [2024-07-23 17:23:17.800278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:22.497 [2024-07-23 17:23:17.800295] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:22.497 [2024-07-23 17:23:17.800377] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:22.497 [2024-07-23 17:23:17.800389] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:22.497 [2024-07-23 17:23:17.800399] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:22.497 [2024-07-23 17:23:17.800417] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:22.497 [2024-07-23 17:23:17.805594] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xce4390 00:29:22.497 spare 00:29:22.497 [2024-07-23 17:23:17.807033] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:22.497 17:23:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.448 17:23:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.706 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:23.706 "name": "raid_bdev1", 00:29:23.706 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:23.706 "strip_size_kb": 0, 00:29:23.706 "state": "online", 00:29:23.706 "raid_level": "raid1", 00:29:23.706 "superblock": true, 00:29:23.706 "num_base_bdevs": 2, 00:29:23.706 "num_base_bdevs_discovered": 2, 00:29:23.706 "num_base_bdevs_operational": 2, 00:29:23.706 "process": { 00:29:23.706 "type": "rebuild", 00:29:23.706 "target": "spare", 00:29:23.706 "progress": { 00:29:23.706 
"blocks": 24576, 00:29:23.706 "percent": 38 00:29:23.706 } 00:29:23.706 }, 00:29:23.706 "base_bdevs_list": [ 00:29:23.706 { 00:29:23.706 "name": "spare", 00:29:23.706 "uuid": "d77f1e3d-595a-5790-80f9-8b717abe15c1", 00:29:23.706 "is_configured": true, 00:29:23.707 "data_offset": 2048, 00:29:23.707 "data_size": 63488 00:29:23.707 }, 00:29:23.707 { 00:29:23.707 "name": "BaseBdev2", 00:29:23.707 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:23.707 "is_configured": true, 00:29:23.707 "data_offset": 2048, 00:29:23.707 "data_size": 63488 00:29:23.707 } 00:29:23.707 ] 00:29:23.707 }' 00:29:23.707 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:23.965 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:23.965 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:23.965 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:23.965 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:24.225 [2024-07-23 17:23:19.418448] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:24.225 [2024-07-23 17:23:19.419804] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:24.225 [2024-07-23 17:23:19.419849] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:24.225 [2024-07-23 17:23:19.419864] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:24.225 [2024-07-23 17:23:19.419872] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.225 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:24.484 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:24.484 "name": "raid_bdev1", 00:29:24.484 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:24.484 "strip_size_kb": 0, 00:29:24.484 "state": "online", 00:29:24.484 "raid_level": "raid1", 00:29:24.484 "superblock": true, 00:29:24.484 "num_base_bdevs": 2, 00:29:24.484 "num_base_bdevs_discovered": 1, 00:29:24.484 "num_base_bdevs_operational": 1, 00:29:24.484 "base_bdevs_list": [ 00:29:24.484 { 00:29:24.484 "name": null, 00:29:24.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:24.484 "is_configured": false, 00:29:24.484 
"data_offset": 2048, 00:29:24.484 "data_size": 63488 00:29:24.484 }, 00:29:24.484 { 00:29:24.484 "name": "BaseBdev2", 00:29:24.484 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:24.484 "is_configured": true, 00:29:24.484 "data_offset": 2048, 00:29:24.484 "data_size": 63488 00:29:24.484 } 00:29:24.484 ] 00:29:24.484 }' 00:29:24.484 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:24.484 17:23:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.050 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.308 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:25.308 "name": "raid_bdev1", 00:29:25.308 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:25.308 "strip_size_kb": 0, 00:29:25.308 "state": "online", 00:29:25.308 "raid_level": "raid1", 00:29:25.308 "superblock": true, 00:29:25.308 "num_base_bdevs": 2, 00:29:25.308 "num_base_bdevs_discovered": 1, 00:29:25.308 "num_base_bdevs_operational": 1, 00:29:25.308 "base_bdevs_list": [ 00:29:25.308 { 00:29:25.308 "name": null, 00:29:25.308 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:29:25.308 "is_configured": false, 00:29:25.308 "data_offset": 2048, 00:29:25.308 "data_size": 63488 00:29:25.308 }, 00:29:25.308 { 00:29:25.308 "name": "BaseBdev2", 00:29:25.308 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:25.308 "is_configured": true, 00:29:25.308 "data_offset": 2048, 00:29:25.308 "data_size": 63488 00:29:25.308 } 00:29:25.308 ] 00:29:25.308 }' 00:29:25.308 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:25.308 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:25.308 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:25.308 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:25.308 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:25.567 17:23:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:25.567 [2024-07-23 17:23:20.985533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:25.567 [2024-07-23 17:23:20.985583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:25.567 [2024-07-23 17:23:20.985605] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd4760 00:29:25.567 [2024-07-23 17:23:20.985617] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:25.567 [2024-07-23 17:23:20.985962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:25.567 [2024-07-23 17:23:20.985980] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:25.567 [2024-07-23 17:23:20.986044] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:25.567 [2024-07-23 17:23:20.986056] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:25.567 [2024-07-23 17:23:20.986066] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:25.826 BaseBdev1 00:29:25.826 17:23:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:26.760 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.760 17:23:22 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.019 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.019 "name": "raid_bdev1", 00:29:27.019 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:27.019 "strip_size_kb": 0, 00:29:27.019 "state": "online", 00:29:27.019 "raid_level": "raid1", 00:29:27.019 "superblock": true, 00:29:27.019 "num_base_bdevs": 2, 00:29:27.019 "num_base_bdevs_discovered": 1, 00:29:27.019 "num_base_bdevs_operational": 1, 00:29:27.019 "base_bdevs_list": [ 00:29:27.019 { 00:29:27.019 "name": null, 00:29:27.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.019 "is_configured": false, 00:29:27.019 "data_offset": 2048, 00:29:27.019 "data_size": 63488 00:29:27.019 }, 00:29:27.019 { 00:29:27.019 "name": "BaseBdev2", 00:29:27.019 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:27.019 "is_configured": true, 00:29:27.019 "data_offset": 2048, 00:29:27.019 "data_size": 63488 00:29:27.019 } 00:29:27.019 ] 00:29:27.019 }' 00:29:27.019 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.019 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.586 17:23:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.844 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:27.844 "name": "raid_bdev1", 00:29:27.844 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:27.844 "strip_size_kb": 0, 00:29:27.844 "state": "online", 00:29:27.844 "raid_level": "raid1", 00:29:27.844 "superblock": true, 00:29:27.844 "num_base_bdevs": 2, 00:29:27.844 "num_base_bdevs_discovered": 1, 00:29:27.844 "num_base_bdevs_operational": 1, 00:29:27.844 "base_bdevs_list": [ 00:29:27.844 { 00:29:27.844 "name": null, 00:29:27.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.844 "is_configured": false, 00:29:27.844 "data_offset": 2048, 00:29:27.844 "data_size": 63488 00:29:27.844 }, 00:29:27.844 { 00:29:27.844 "name": "BaseBdev2", 00:29:27.844 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:27.844 "is_configured": true, 00:29:27.844 "data_offset": 2048, 00:29:27.844 "data_size": 63488 00:29:27.845 } 00:29:27.845 ] 00:29:27.845 }' 00:29:27.845 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:28.102 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:28.103 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:28.669 [2024-07-23 17:23:23.857625] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:28.669 [2024-07-23 17:23:23.857759] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:28.669 
[2024-07-23 17:23:23.857775] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:28.669 request: 00:29:28.669 { 00:29:28.669 "base_bdev": "BaseBdev1", 00:29:28.669 "raid_bdev": "raid_bdev1", 00:29:28.669 "method": "bdev_raid_add_base_bdev", 00:29:28.669 "req_id": 1 00:29:28.669 } 00:29:28.669 Got JSON-RPC error response 00:29:28.669 response: 00:29:28.669 { 00:29:28.669 "code": -22, 00:29:28.669 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:28.669 } 00:29:28.669 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:29:28.669 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:28.669 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:28.669 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:28.669 17:23:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:29.603 17:23:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.603 17:23:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:29.862 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:29.862 "name": "raid_bdev1", 00:29:29.862 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:29.862 "strip_size_kb": 0, 00:29:29.862 "state": "online", 00:29:29.862 "raid_level": "raid1", 00:29:29.862 "superblock": true, 00:29:29.862 "num_base_bdevs": 2, 00:29:29.862 "num_base_bdevs_discovered": 1, 00:29:29.862 "num_base_bdevs_operational": 1, 00:29:29.862 "base_bdevs_list": [ 00:29:29.862 { 00:29:29.862 "name": null, 00:29:29.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:29.862 "is_configured": false, 00:29:29.862 "data_offset": 2048, 00:29:29.862 "data_size": 63488 00:29:29.862 }, 00:29:29.862 { 00:29:29.862 "name": "BaseBdev2", 00:29:29.862 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:29.862 "is_configured": true, 00:29:29.862 "data_offset": 2048, 00:29:29.862 "data_size": 63488 00:29:29.862 } 00:29:29.862 ] 00:29:29.862 }' 00:29:29.862 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:29.862 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:30.797 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:30.797 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:30.797 17:23:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:30.797 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:30.797 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:30.797 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.797 17:23:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:30.797 "name": "raid_bdev1", 00:29:30.797 "uuid": "ad0e08af-e17a-47bf-a394-181c0de5334e", 00:29:30.797 "strip_size_kb": 0, 00:29:30.797 "state": "online", 00:29:30.797 "raid_level": "raid1", 00:29:30.797 "superblock": true, 00:29:30.797 "num_base_bdevs": 2, 00:29:30.797 "num_base_bdevs_discovered": 1, 00:29:30.797 "num_base_bdevs_operational": 1, 00:29:30.797 "base_bdevs_list": [ 00:29:30.797 { 00:29:30.797 "name": null, 00:29:30.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:30.797 "is_configured": false, 00:29:30.797 "data_offset": 2048, 00:29:30.797 "data_size": 63488 00:29:30.797 }, 00:29:30.797 { 00:29:30.797 "name": "BaseBdev2", 00:29:30.797 "uuid": "3c0520ac-1a53-5da2-b498-ce91a08051f0", 00:29:30.797 "is_configured": true, 00:29:30.797 "data_offset": 2048, 00:29:30.797 "data_size": 63488 00:29:30.797 } 00:29:30.797 ] 00:29:30.797 }' 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:30.797 17:23:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 37461 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 37461 ']' 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 37461 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 37461 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 37461' 00:29:30.797 killing process with pid 37461 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 37461 00:29:30.797 Received shutdown signal, test time was about 27.720476 seconds 00:29:30.797 00:29:30.797 Latency(us) 00:29:30.797 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:30.797 =================================================================================================================== 00:29:30.797 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:30.797 [2024-07-23 17:23:26.217144] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:30.797 [2024-07-23 17:23:26.217245] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:30.797 [2024-07-23 17:23:26.217299] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:29:30.797 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 37461 00:29:30.797 [2024-07-23 17:23:26.217312] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd4a80 name raid_bdev1, state offline 00:29:31.057 [2024-07-23 17:23:26.241644] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:31.057 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:29:31.057 00:29:31.057 real 0m31.825s 00:29:31.057 user 0m50.288s 00:29:31.057 sys 0m4.666s 00:29:31.057 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:31.057 17:23:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:29:31.057 ************************************ 00:29:31.057 END TEST raid_rebuild_test_sb_io 00:29:31.057 ************************************ 00:29:31.316 17:23:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:31.316 17:23:26 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:29:31.316 17:23:26 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:29:31.316 17:23:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:31.316 17:23:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:31.316 17:23:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:31.316 ************************************ 00:29:31.316 START TEST raid_rebuild_test 00:29:31.316 ************************************ 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@570 -- # local superblock=false 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:31.316 17:23:26 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=41962 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 41962 /var/tmp/spdk-raid.sock 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 41962 ']' 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:31.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:31.316 17:23:26 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:29:31.316 [2024-07-23 17:23:26.624102] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:29:31.316 [2024-07-23 17:23:26.624173] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid41962 ] 00:29:31.316 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:31.316 Zero copy mechanism will not be used. 00:29:31.574 [2024-07-23 17:23:26.756304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:31.574 [2024-07-23 17:23:26.809295] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:31.574 [2024-07-23 17:23:26.864404] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:31.574 [2024-07-23 17:23:26.864433] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:32.137 17:23:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:32.137 17:23:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:29:32.137 17:23:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:32.137 17:23:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:29:32.394 BaseBdev1_malloc 00:29:32.394 17:23:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:32.652 [2024-07-23 
17:23:28.036539] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:32.652 [2024-07-23 17:23:28.036589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:32.652 [2024-07-23 17:23:28.036615] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c83170 00:29:32.652 [2024-07-23 17:23:28.036628] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:32.652 [2024-07-23 17:23:28.038336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:32.652 [2024-07-23 17:23:28.038364] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:32.652 BaseBdev1 00:29:32.652 17:23:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:32.652 17:23:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:29:32.911 BaseBdev2_malloc 00:29:32.911 17:23:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:33.169 [2024-07-23 17:23:28.527725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:33.169 [2024-07-23 17:23:28.527770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:33.169 [2024-07-23 17:23:28.527793] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b69680 00:29:33.169 [2024-07-23 17:23:28.527812] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:33.169 [2024-07-23 17:23:28.529400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:33.169 [2024-07-23 17:23:28.529427] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:33.169 BaseBdev2 00:29:33.169 17:23:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:33.169 17:23:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:29:33.428 BaseBdev3_malloc 00:29:33.428 17:23:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:29:33.687 [2024-07-23 17:23:29.022696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:29:33.687 [2024-07-23 17:23:29.022742] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:33.687 [2024-07-23 17:23:29.022766] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6a3d0 00:29:33.687 [2024-07-23 17:23:29.022778] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:33.687 [2024-07-23 17:23:29.024325] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:33.687 [2024-07-23 17:23:29.024352] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:29:33.687 BaseBdev3 00:29:33.687 17:23:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:33.687 17:23:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:29:33.946 BaseBdev4_malloc 00:29:33.946 17:23:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:29:34.205 [2024-07-23 17:23:29.516556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:29:34.205 [2024-07-23 17:23:29.516604] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:34.205 [2024-07-23 17:23:29.516626] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6e2c0 00:29:34.205 [2024-07-23 17:23:29.516638] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:34.205 [2024-07-23 17:23:29.518200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:34.205 [2024-07-23 17:23:29.518228] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:29:34.205 BaseBdev4 00:29:34.205 17:23:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:29:34.463 spare_malloc 00:29:34.463 17:23:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:34.722 spare_delay 00:29:34.722 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:34.980 [2024-07-23 17:23:30.259039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:34.980 [2024-07-23 17:23:30.259086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:34.980 [2024-07-23 17:23:30.259108] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b6ce80 00:29:34.980 [2024-07-23 17:23:30.259120] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:34.980 
[2024-07-23 17:23:30.260731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:34.980 [2024-07-23 17:23:30.260759] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:34.980 spare 00:29:34.980 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:29:35.239 [2024-07-23 17:23:30.503706] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:35.239 [2024-07-23 17:23:30.505042] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:35.239 [2024-07-23 17:23:30.505098] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:35.239 [2024-07-23 17:23:30.505143] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:29:35.239 [2024-07-23 17:23:30.505223] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad2530 00:29:35.239 [2024-07-23 17:23:30.505233] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:29:35.239 [2024-07-23 17:23:30.505448] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b88680 00:29:35.239 [2024-07-23 17:23:30.505599] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad2530 00:29:35.239 [2024-07-23 17:23:30.505609] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad2530 00:29:35.239 [2024-07-23 17:23:30.505726] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.239 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:35.497 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:35.497 "name": "raid_bdev1", 00:29:35.497 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:35.497 "strip_size_kb": 0, 00:29:35.497 "state": "online", 00:29:35.497 "raid_level": "raid1", 00:29:35.497 "superblock": false, 00:29:35.497 "num_base_bdevs": 4, 00:29:35.497 "num_base_bdevs_discovered": 4, 00:29:35.497 "num_base_bdevs_operational": 4, 00:29:35.497 "base_bdevs_list": [ 00:29:35.497 { 00:29:35.497 "name": "BaseBdev1", 00:29:35.497 "uuid": "74368173-ff3c-5387-b1a3-ef236ef8ab97", 00:29:35.498 "is_configured": true, 00:29:35.498 "data_offset": 0, 00:29:35.498 "data_size": 65536 00:29:35.498 }, 00:29:35.498 { 00:29:35.498 "name": "BaseBdev2", 00:29:35.498 "uuid": "e8b26f4e-1d62-5c2e-8af3-4d38c6bfa9d1", 
00:29:35.498 "is_configured": true, 00:29:35.498 "data_offset": 0, 00:29:35.498 "data_size": 65536 00:29:35.498 }, 00:29:35.498 { 00:29:35.498 "name": "BaseBdev3", 00:29:35.498 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:35.498 "is_configured": true, 00:29:35.498 "data_offset": 0, 00:29:35.498 "data_size": 65536 00:29:35.498 }, 00:29:35.498 { 00:29:35.498 "name": "BaseBdev4", 00:29:35.498 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:35.498 "is_configured": true, 00:29:35.498 "data_offset": 0, 00:29:35.498 "data_size": 65536 00:29:35.498 } 00:29:35.498 ] 00:29:35.498 }' 00:29:35.498 17:23:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:35.498 17:23:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:29:36.065 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:36.065 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:36.324 [2024-07-23 17:23:31.586822] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:36.324 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:29:36.324 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.324 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # 
local write_unit_size 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:36.583 17:23:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:29:36.843 [2024-07-23 17:23:32.115970] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b86e60 00:29:36.843 /dev/nbd0 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q 
-w nbd0 /proc/partitions 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:36.843 1+0 records in 00:29:36.843 1+0 records out 00:29:36.843 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253625 s, 16.1 MB/s 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:29:36.843 17:23:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:29:44.993 65536+0 records in 00:29:44.993 65536+0 records out 00:29:44.993 33554432 bytes (34 MB, 32 MiB) copied, 7.9294 s, 4.2 MB/s 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:44.993 [2024-07-23 17:23:40.375317] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:29:44.993 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:45.252 [2024-07-23 17:23:40.607976] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:45.252 
17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.252 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.511 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.511 "name": "raid_bdev1", 00:29:45.511 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:45.511 "strip_size_kb": 0, 00:29:45.511 "state": "online", 00:29:45.511 "raid_level": "raid1", 00:29:45.511 "superblock": false, 00:29:45.511 "num_base_bdevs": 4, 00:29:45.511 "num_base_bdevs_discovered": 3, 00:29:45.511 "num_base_bdevs_operational": 3, 00:29:45.511 "base_bdevs_list": [ 00:29:45.511 { 00:29:45.511 "name": null, 00:29:45.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.511 "is_configured": 
false, 00:29:45.511 "data_offset": 0, 00:29:45.511 "data_size": 65536 00:29:45.511 }, 00:29:45.511 { 00:29:45.511 "name": "BaseBdev2", 00:29:45.511 "uuid": "e8b26f4e-1d62-5c2e-8af3-4d38c6bfa9d1", 00:29:45.511 "is_configured": true, 00:29:45.511 "data_offset": 0, 00:29:45.511 "data_size": 65536 00:29:45.511 }, 00:29:45.511 { 00:29:45.511 "name": "BaseBdev3", 00:29:45.511 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:45.511 "is_configured": true, 00:29:45.511 "data_offset": 0, 00:29:45.511 "data_size": 65536 00:29:45.511 }, 00:29:45.511 { 00:29:45.511 "name": "BaseBdev4", 00:29:45.511 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:45.511 "is_configured": true, 00:29:45.511 "data_offset": 0, 00:29:45.511 "data_size": 65536 00:29:45.511 } 00:29:45.511 ] 00:29:45.511 }' 00:29:45.511 17:23:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.511 17:23:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:29:46.079 17:23:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:46.338 [2024-07-23 17:23:41.694850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:46.338 [2024-07-23 17:23:41.698883] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b6c500 00:29:46.338 [2024-07-23 17:23:41.701310] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:46.338 17:23:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:47.715 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:47.715 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:47.715 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:29:47.715 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:47.715 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:47.716 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.716 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:47.716 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:47.716 "name": "raid_bdev1", 00:29:47.716 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:47.716 "strip_size_kb": 0, 00:29:47.716 "state": "online", 00:29:47.716 "raid_level": "raid1", 00:29:47.716 "superblock": false, 00:29:47.716 "num_base_bdevs": 4, 00:29:47.716 "num_base_bdevs_discovered": 4, 00:29:47.716 "num_base_bdevs_operational": 4, 00:29:47.716 "process": { 00:29:47.716 "type": "rebuild", 00:29:47.716 "target": "spare", 00:29:47.716 "progress": { 00:29:47.716 "blocks": 24576, 00:29:47.716 "percent": 37 00:29:47.716 } 00:29:47.716 }, 00:29:47.716 "base_bdevs_list": [ 00:29:47.716 { 00:29:47.716 "name": "spare", 00:29:47.716 "uuid": "3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:47.716 "is_configured": true, 00:29:47.716 "data_offset": 0, 00:29:47.716 "data_size": 65536 00:29:47.716 }, 00:29:47.716 { 00:29:47.716 "name": "BaseBdev2", 00:29:47.716 "uuid": "e8b26f4e-1d62-5c2e-8af3-4d38c6bfa9d1", 00:29:47.716 "is_configured": true, 00:29:47.716 "data_offset": 0, 00:29:47.716 "data_size": 65536 00:29:47.716 }, 00:29:47.716 { 00:29:47.716 "name": "BaseBdev3", 00:29:47.716 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:47.716 "is_configured": true, 00:29:47.716 "data_offset": 0, 00:29:47.716 "data_size": 65536 00:29:47.716 }, 00:29:47.716 { 00:29:47.716 "name": "BaseBdev4", 00:29:47.716 "uuid": 
"96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:47.716 "is_configured": true, 00:29:47.716 "data_offset": 0, 00:29:47.716 "data_size": 65536 00:29:47.716 } 00:29:47.716 ] 00:29:47.716 }' 00:29:47.716 17:23:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:47.716 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:47.716 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:47.716 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:47.716 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:47.974 [2024-07-23 17:23:43.315761] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:48.233 [2024-07-23 17:23:43.414908] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:48.233 [2024-07-23 17:23:43.414952] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:48.233 [2024-07-23 17:23:43.414969] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:48.233 [2024-07-23 17:23:43.414977] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.233 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:48.491 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:48.491 "name": "raid_bdev1", 00:29:48.491 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:48.491 "strip_size_kb": 0, 00:29:48.491 "state": "online", 00:29:48.491 "raid_level": "raid1", 00:29:48.491 "superblock": false, 00:29:48.491 "num_base_bdevs": 4, 00:29:48.491 "num_base_bdevs_discovered": 3, 00:29:48.491 "num_base_bdevs_operational": 3, 00:29:48.491 "base_bdevs_list": [ 00:29:48.491 { 00:29:48.491 "name": null, 00:29:48.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:48.491 "is_configured": false, 00:29:48.491 "data_offset": 0, 00:29:48.491 "data_size": 65536 00:29:48.491 }, 00:29:48.491 { 00:29:48.491 "name": "BaseBdev2", 00:29:48.491 "uuid": "e8b26f4e-1d62-5c2e-8af3-4d38c6bfa9d1", 00:29:48.491 "is_configured": true, 00:29:48.491 "data_offset": 0, 00:29:48.491 "data_size": 65536 00:29:48.491 }, 00:29:48.491 { 00:29:48.491 "name": "BaseBdev3", 00:29:48.491 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:48.491 "is_configured": true, 00:29:48.491 "data_offset": 0, 00:29:48.491 "data_size": 65536 
00:29:48.491 }, 00:29:48.491 { 00:29:48.491 "name": "BaseBdev4", 00:29:48.491 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:48.491 "is_configured": true, 00:29:48.491 "data_offset": 0, 00:29:48.491 "data_size": 65536 00:29:48.491 } 00:29:48.491 ] 00:29:48.491 }' 00:29:48.491 17:23:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:48.491 17:23:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.059 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:49.318 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:49.318 "name": "raid_bdev1", 00:29:49.318 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:49.318 "strip_size_kb": 0, 00:29:49.318 "state": "online", 00:29:49.318 "raid_level": "raid1", 00:29:49.318 "superblock": false, 00:29:49.318 "num_base_bdevs": 4, 00:29:49.318 "num_base_bdevs_discovered": 3, 00:29:49.318 "num_base_bdevs_operational": 3, 00:29:49.318 "base_bdevs_list": [ 00:29:49.318 { 00:29:49.318 "name": null, 00:29:49.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:49.318 "is_configured": false, 00:29:49.318 "data_offset": 0, 00:29:49.318 
"data_size": 65536 00:29:49.318 }, 00:29:49.318 { 00:29:49.318 "name": "BaseBdev2", 00:29:49.318 "uuid": "e8b26f4e-1d62-5c2e-8af3-4d38c6bfa9d1", 00:29:49.318 "is_configured": true, 00:29:49.318 "data_offset": 0, 00:29:49.318 "data_size": 65536 00:29:49.318 }, 00:29:49.318 { 00:29:49.318 "name": "BaseBdev3", 00:29:49.318 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:49.318 "is_configured": true, 00:29:49.318 "data_offset": 0, 00:29:49.318 "data_size": 65536 00:29:49.318 }, 00:29:49.318 { 00:29:49.318 "name": "BaseBdev4", 00:29:49.318 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:49.318 "is_configured": true, 00:29:49.318 "data_offset": 0, 00:29:49.318 "data_size": 65536 00:29:49.318 } 00:29:49.318 ] 00:29:49.318 }' 00:29:49.318 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:49.318 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:49.318 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:49.318 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:49.318 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:49.577 [2024-07-23 17:23:44.871498] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:49.577 [2024-07-23 17:23:44.876031] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b87eb0 00:29:49.577 [2024-07-23 17:23:44.877547] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:49.577 17:23:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:50.514 17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:50.514 
17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:50.514 17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:50.514 17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:50.514 17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:50.514 17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.514 17:23:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:50.773 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:50.773 "name": "raid_bdev1", 00:29:50.773 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:50.773 "strip_size_kb": 0, 00:29:50.773 "state": "online", 00:29:50.773 "raid_level": "raid1", 00:29:50.773 "superblock": false, 00:29:50.773 "num_base_bdevs": 4, 00:29:50.773 "num_base_bdevs_discovered": 4, 00:29:50.773 "num_base_bdevs_operational": 4, 00:29:50.773 "process": { 00:29:50.773 "type": "rebuild", 00:29:50.773 "target": "spare", 00:29:50.773 "progress": { 00:29:50.773 "blocks": 24576, 00:29:50.773 "percent": 37 00:29:50.773 } 00:29:50.773 }, 00:29:50.773 "base_bdevs_list": [ 00:29:50.773 { 00:29:50.773 "name": "spare", 00:29:50.773 "uuid": "3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:50.773 "is_configured": true, 00:29:50.773 "data_offset": 0, 00:29:50.773 "data_size": 65536 00:29:50.773 }, 00:29:50.773 { 00:29:50.773 "name": "BaseBdev2", 00:29:50.773 "uuid": "e8b26f4e-1d62-5c2e-8af3-4d38c6bfa9d1", 00:29:50.773 "is_configured": true, 00:29:50.773 "data_offset": 0, 00:29:50.773 "data_size": 65536 00:29:50.773 }, 00:29:50.773 { 00:29:50.773 "name": "BaseBdev3", 00:29:50.773 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:50.773 
"is_configured": true, 00:29:50.773 "data_offset": 0, 00:29:50.773 "data_size": 65536 00:29:50.773 }, 00:29:50.773 { 00:29:50.773 "name": "BaseBdev4", 00:29:50.773 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:50.773 "is_configured": true, 00:29:50.773 "data_offset": 0, 00:29:50.773 "data_size": 65536 00:29:50.773 } 00:29:50.773 ] 00:29:50.773 }' 00:29:50.773 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:50.773 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:50.773 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:51.032 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:51.032 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:29:51.032 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:29:51.032 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:51.032 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:29:51.032 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:29:51.291 [2024-07-23 17:23:46.456953] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:51.291 [2024-07-23 17:23:46.490429] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1b87eb0 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.291 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:51.550 "name": "raid_bdev1", 00:29:51.550 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:51.550 "strip_size_kb": 0, 00:29:51.550 "state": "online", 00:29:51.550 "raid_level": "raid1", 00:29:51.550 "superblock": false, 00:29:51.550 "num_base_bdevs": 4, 00:29:51.550 "num_base_bdevs_discovered": 3, 00:29:51.550 "num_base_bdevs_operational": 3, 00:29:51.550 "process": { 00:29:51.550 "type": "rebuild", 00:29:51.550 "target": "spare", 00:29:51.550 "progress": { 00:29:51.550 "blocks": 36864, 00:29:51.550 "percent": 56 00:29:51.550 } 00:29:51.550 }, 00:29:51.550 "base_bdevs_list": [ 00:29:51.550 { 00:29:51.550 "name": "spare", 00:29:51.550 "uuid": "3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:51.550 "is_configured": true, 00:29:51.550 "data_offset": 0, 00:29:51.550 "data_size": 65536 00:29:51.550 }, 00:29:51.550 { 00:29:51.550 "name": null, 00:29:51.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.550 "is_configured": false, 00:29:51.550 "data_offset": 0, 00:29:51.550 "data_size": 65536 00:29:51.550 }, 00:29:51.550 { 00:29:51.550 "name": "BaseBdev3", 00:29:51.550 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:51.550 
"is_configured": true, 00:29:51.550 "data_offset": 0, 00:29:51.550 "data_size": 65536 00:29:51.550 }, 00:29:51.550 { 00:29:51.550 "name": "BaseBdev4", 00:29:51.550 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:51.550 "is_configured": true, 00:29:51.550 "data_offset": 0, 00:29:51.550 "data_size": 65536 00:29:51.550 } 00:29:51.550 ] 00:29:51.550 }' 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=926 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.550 17:23:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.810 17:23:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:51.810 "name": 
"raid_bdev1", 00:29:51.810 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:51.810 "strip_size_kb": 0, 00:29:51.810 "state": "online", 00:29:51.810 "raid_level": "raid1", 00:29:51.810 "superblock": false, 00:29:51.810 "num_base_bdevs": 4, 00:29:51.810 "num_base_bdevs_discovered": 3, 00:29:51.810 "num_base_bdevs_operational": 3, 00:29:51.810 "process": { 00:29:51.810 "type": "rebuild", 00:29:51.810 "target": "spare", 00:29:51.810 "progress": { 00:29:51.810 "blocks": 45056, 00:29:51.810 "percent": 68 00:29:51.810 } 00:29:51.810 }, 00:29:51.810 "base_bdevs_list": [ 00:29:51.810 { 00:29:51.810 "name": "spare", 00:29:51.810 "uuid": "3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:51.810 "is_configured": true, 00:29:51.810 "data_offset": 0, 00:29:51.810 "data_size": 65536 00:29:51.810 }, 00:29:51.810 { 00:29:51.810 "name": null, 00:29:51.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.810 "is_configured": false, 00:29:51.810 "data_offset": 0, 00:29:51.810 "data_size": 65536 00:29:51.810 }, 00:29:51.810 { 00:29:51.810 "name": "BaseBdev3", 00:29:51.810 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:51.810 "is_configured": true, 00:29:51.810 "data_offset": 0, 00:29:51.810 "data_size": 65536 00:29:51.810 }, 00:29:51.810 { 00:29:51.810 "name": "BaseBdev4", 00:29:51.810 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:51.810 "is_configured": true, 00:29:51.810 "data_offset": 0, 00:29:51.810 "data_size": 65536 00:29:51.810 } 00:29:51.810 ] 00:29:51.810 }' 00:29:51.810 17:23:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:51.810 17:23:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:51.810 17:23:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:51.810 17:23:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:51.810 17:23:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:29:52.745 [2024-07-23 17:23:48.102899] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:52.745 [2024-07-23 17:23:48.102962] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:52.745 [2024-07-23 17:23:48.103000] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.003 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:53.261 "name": "raid_bdev1", 00:29:53.261 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:53.261 "strip_size_kb": 0, 00:29:53.261 "state": "online", 00:29:53.261 "raid_level": "raid1", 00:29:53.261 "superblock": false, 00:29:53.261 "num_base_bdevs": 4, 00:29:53.261 "num_base_bdevs_discovered": 3, 00:29:53.261 "num_base_bdevs_operational": 3, 00:29:53.261 "base_bdevs_list": [ 00:29:53.261 { 00:29:53.261 "name": "spare", 00:29:53.261 "uuid": "3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:53.261 
"is_configured": true, 00:29:53.261 "data_offset": 0, 00:29:53.261 "data_size": 65536 00:29:53.261 }, 00:29:53.261 { 00:29:53.261 "name": null, 00:29:53.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:53.261 "is_configured": false, 00:29:53.261 "data_offset": 0, 00:29:53.261 "data_size": 65536 00:29:53.261 }, 00:29:53.261 { 00:29:53.261 "name": "BaseBdev3", 00:29:53.261 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:53.261 "is_configured": true, 00:29:53.261 "data_offset": 0, 00:29:53.261 "data_size": 65536 00:29:53.261 }, 00:29:53.261 { 00:29:53.261 "name": "BaseBdev4", 00:29:53.261 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:53.261 "is_configured": true, 00:29:53.261 "data_offset": 0, 00:29:53.261 "data_size": 65536 00:29:53.261 } 00:29:53.261 ] 00:29:53.261 }' 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:29:53.261 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.519 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:53.519 "name": "raid_bdev1", 00:29:53.519 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:53.519 "strip_size_kb": 0, 00:29:53.519 "state": "online", 00:29:53.519 "raid_level": "raid1", 00:29:53.520 "superblock": false, 00:29:53.520 "num_base_bdevs": 4, 00:29:53.520 "num_base_bdevs_discovered": 3, 00:29:53.520 "num_base_bdevs_operational": 3, 00:29:53.520 "base_bdevs_list": [ 00:29:53.520 { 00:29:53.520 "name": "spare", 00:29:53.520 "uuid": "3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:53.520 "is_configured": true, 00:29:53.520 "data_offset": 0, 00:29:53.520 "data_size": 65536 00:29:53.520 }, 00:29:53.520 { 00:29:53.520 "name": null, 00:29:53.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:53.520 "is_configured": false, 00:29:53.520 "data_offset": 0, 00:29:53.520 "data_size": 65536 00:29:53.520 }, 00:29:53.520 { 00:29:53.520 "name": "BaseBdev3", 00:29:53.520 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:53.520 "is_configured": true, 00:29:53.520 "data_offset": 0, 00:29:53.520 "data_size": 65536 00:29:53.520 }, 00:29:53.520 { 00:29:53.520 "name": "BaseBdev4", 00:29:53.520 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:53.520 "is_configured": true, 00:29:53.520 "data_offset": 0, 00:29:53.520 "data_size": 65536 00:29:53.520 } 00:29:53.520 ] 00:29:53.520 }' 00:29:53.520 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:53.520 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:53.520 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.778 17:23:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:54.036 17:23:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:54.036 "name": "raid_bdev1", 00:29:54.036 "uuid": "be11cf9f-eccb-4ac3-8e60-34378d0e665a", 00:29:54.036 "strip_size_kb": 0, 00:29:54.036 "state": "online", 00:29:54.036 "raid_level": "raid1", 00:29:54.036 "superblock": false, 00:29:54.036 "num_base_bdevs": 4, 00:29:54.036 "num_base_bdevs_discovered": 3, 00:29:54.036 "num_base_bdevs_operational": 3, 00:29:54.036 "base_bdevs_list": [ 00:29:54.036 { 00:29:54.036 "name": "spare", 00:29:54.036 "uuid": 
"3222e2a3-767f-5c47-b569-dd5fba025661", 00:29:54.036 "is_configured": true, 00:29:54.036 "data_offset": 0, 00:29:54.036 "data_size": 65536 00:29:54.036 }, 00:29:54.036 { 00:29:54.036 "name": null, 00:29:54.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:54.036 "is_configured": false, 00:29:54.036 "data_offset": 0, 00:29:54.036 "data_size": 65536 00:29:54.036 }, 00:29:54.036 { 00:29:54.036 "name": "BaseBdev3", 00:29:54.036 "uuid": "31b73236-a6f3-54a9-8f42-f445f790734f", 00:29:54.036 "is_configured": true, 00:29:54.036 "data_offset": 0, 00:29:54.036 "data_size": 65536 00:29:54.036 }, 00:29:54.036 { 00:29:54.036 "name": "BaseBdev4", 00:29:54.036 "uuid": "96bd8425-b2df-5ff1-826e-e50fcc58c109", 00:29:54.036 "is_configured": true, 00:29:54.036 "data_offset": 0, 00:29:54.036 "data_size": 65536 00:29:54.036 } 00:29:54.036 ] 00:29:54.036 }' 00:29:54.036 17:23:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:54.036 17:23:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:29:54.604 17:23:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:54.604 [2024-07-23 17:23:49.979985] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:54.604 [2024-07-23 17:23:49.980014] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:54.604 [2024-07-23 17:23:49.980069] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:54.604 [2024-07-23 17:23:49.980133] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:54.604 [2024-07-23 17:23:49.980144] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad2530 name raid_bdev1, state offline 00:29:54.604 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 
00:29:54.604 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:54.863 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:55.432 /dev/nbd0 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 
00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:55.432 1+0 records in 00:29:55.432 1+0 records out 00:29:55.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281505 s, 14.6 MB/s 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:55.432 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:55.432 /dev/nbd1 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:55.691 1+0 records in 00:29:55.691 1+0 records out 00:29:55.691 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337013 s, 12.2 MB/s 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:55.691 17:23:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:55.950 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 41962 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 41962 ']' 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 41962 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 41962 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 41962' 00:29:56.209 killing process with pid 41962 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 41962 00:29:56.209 Received shutdown signal, test time was about 60.000000 seconds 00:29:56.209 00:29:56.209 Latency(us) 00:29:56.209 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:56.209 =================================================================================================================== 00:29:56.209 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:56.209 [2024-07-23 17:23:51.430068] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:56.209 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 41962 00:29:56.209 [2024-07-23 17:23:51.478266] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:29:56.469 00:29:56.469 real 0m25.130s 00:29:56.469 user 0m33.607s 00:29:56.469 sys 0m5.508s 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:29:56.469 ************************************ 00:29:56.469 END TEST raid_rebuild_test 00:29:56.469 ************************************ 00:29:56.469 17:23:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:56.469 17:23:51 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:29:56.469 17:23:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:56.469 17:23:51 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:29:56.469 17:23:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:56.469 ************************************ 00:29:56.469 START TEST raid_rebuild_test_sb 00:29:56.469 ************************************ 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=45504 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 45504 /var/tmp/spdk-raid.sock 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 
00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 45504 ']' 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:56.469 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:56.469 17:23:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:29:56.469 [2024-07-23 17:23:51.849475] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:29:56.469 [2024-07-23 17:23:51.849554] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid45504 ] 00:29:56.469 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:56.469 Zero copy mechanism will not be used. 
00:29:56.729 [2024-07-23 17:23:51.982855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:56.729 [2024-07-23 17:23:52.038522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:56.729 [2024-07-23 17:23:52.097474] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:56.729 [2024-07-23 17:23:52.097503] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:57.336 17:23:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:57.336 17:23:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:29:57.336 17:23:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:57.336 17:23:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:29:57.615 BaseBdev1_malloc 00:29:57.615 17:23:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:57.873 [2024-07-23 17:23:53.133517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:57.873 [2024-07-23 17:23:53.133566] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:57.873 [2024-07-23 17:23:53.133591] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ce170 00:29:57.873 [2024-07-23 17:23:53.133604] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:57.873 [2024-07-23 17:23:53.135135] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:57.873 [2024-07-23 17:23:53.135164] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:57.873 BaseBdev1 
00:29:57.873 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:57.873 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:29:58.139 BaseBdev2_malloc 00:29:58.139 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:58.139 [2024-07-23 17:23:53.519294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:58.139 [2024-07-23 17:23:53.519346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:58.139 [2024-07-23 17:23:53.519366] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b4680 00:29:58.139 [2024-07-23 17:23:53.519378] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:58.139 [2024-07-23 17:23:53.520736] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:58.139 [2024-07-23 17:23:53.520765] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:58.139 BaseBdev2 00:29:58.139 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:58.139 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:29:58.401 BaseBdev3_malloc 00:29:58.401 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:29:58.659 [2024-07-23 17:23:53.904961] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:29:58.659 [2024-07-23 17:23:53.905015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:58.659 [2024-07-23 17:23:53.905045] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b53d0 00:29:58.659 [2024-07-23 17:23:53.905058] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:58.659 [2024-07-23 17:23:53.906437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:58.659 [2024-07-23 17:23:53.906466] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:29:58.659 BaseBdev3 00:29:58.659 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:58.659 17:23:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:29:58.918 BaseBdev4_malloc 00:29:58.918 17:23:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:29:58.918 [2024-07-23 17:23:54.290547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:29:58.918 [2024-07-23 17:23:54.290597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:58.918 [2024-07-23 17:23:54.290617] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b92c0 00:29:58.918 [2024-07-23 17:23:54.290630] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:58.918 [2024-07-23 17:23:54.292026] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:58.918 [2024-07-23 17:23:54.292055] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:29:58.918 BaseBdev4 00:29:58.918 17:23:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:29:59.177 spare_malloc 00:29:59.177 17:23:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:59.436 spare_delay 00:29:59.436 17:23:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:59.436 [2024-07-23 17:23:54.856602] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:59.436 [2024-07-23 17:23:54.856649] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:59.436 [2024-07-23 17:23:54.856668] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b7e80 00:29:59.436 [2024-07-23 17:23:54.856680] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:59.695 [2024-07-23 17:23:54.858040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:59.695 [2024-07-23 17:23:54.858067] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:59.695 spare 00:29:59.695 17:23:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:29:59.695 [2024-07-23 17:23:55.041139] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:59.695 [2024-07-23 17:23:55.042262] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:59.695 [2024-07-23 17:23:55.042316] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:29:59.695 [2024-07-23 17:23:55.042360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:29:59.695 [2024-07-23 17:23:55.042538] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x221d530 00:29:59.695 [2024-07-23 17:23:55.042549] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:29:59.695 [2024-07-23 17:23:55.042724] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x221d500 00:29:59.695 [2024-07-23 17:23:55.042868] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x221d530 00:29:59.695 [2024-07-23 17:23:55.042878] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x221d530 00:29:59.695 [2024-07-23 17:23:55.042973] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:59.695 
17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:59.695 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:59.954 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:59.954 "name": "raid_bdev1", 00:29:59.954 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:29:59.954 "strip_size_kb": 0, 00:29:59.954 "state": "online", 00:29:59.954 "raid_level": "raid1", 00:29:59.954 "superblock": true, 00:29:59.954 "num_base_bdevs": 4, 00:29:59.954 "num_base_bdevs_discovered": 4, 00:29:59.954 "num_base_bdevs_operational": 4, 00:29:59.954 "base_bdevs_list": [ 00:29:59.954 { 00:29:59.954 "name": "BaseBdev1", 00:29:59.954 "uuid": "c7683433-575f-59b6-aa03-96997809968c", 00:29:59.954 "is_configured": true, 00:29:59.954 "data_offset": 2048, 00:29:59.954 "data_size": 63488 00:29:59.954 }, 00:29:59.954 { 00:29:59.954 "name": "BaseBdev2", 00:29:59.954 "uuid": "e1ef2ab1-b81a-54ff-9294-01743655f17b", 00:29:59.954 "is_configured": true, 00:29:59.954 "data_offset": 2048, 00:29:59.954 "data_size": 63488 00:29:59.954 }, 00:29:59.954 { 00:29:59.954 "name": "BaseBdev3", 00:29:59.954 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:29:59.954 "is_configured": true, 00:29:59.954 "data_offset": 2048, 00:29:59.954 "data_size": 63488 00:29:59.954 }, 00:29:59.954 { 00:29:59.954 "name": "BaseBdev4", 00:29:59.954 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:29:59.955 "is_configured": true, 00:29:59.955 "data_offset": 2048, 00:29:59.955 "data_size": 63488 00:29:59.955 } 00:29:59.955 ] 00:29:59.955 }' 00:29:59.955 17:23:55 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:59.955 17:23:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:00.521 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:00.521 17:23:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:30:00.779 [2024-07-23 17:23:56.036041] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:00.779 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:30:00.779 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.779 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:01.101 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:30:01.360 [2024-07-23 17:23:56.541150] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d3680 00:30:01.360 /dev/nbd0 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:30:01.360 1+0 records in 00:30:01.360 1+0 records out 00:30:01.360 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263821 s, 15.5 MB/s 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:30:01.360 17:23:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:30:09.476 63488+0 records in 00:30:09.476 63488+0 records out 00:30:09.476 32505856 bytes (33 MB, 31 MiB) copied, 7.37335 s, 4.4 MB/s 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:09.476 17:24:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:09.476 [2024-07-23 17:24:04.180045] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:09.476 [2024-07-23 17:24:04.344504] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.476 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:09.476 "name": "raid_bdev1", 00:30:09.476 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:09.476 "strip_size_kb": 0, 00:30:09.476 "state": "online", 00:30:09.476 "raid_level": "raid1", 00:30:09.476 "superblock": true, 00:30:09.476 "num_base_bdevs": 4, 00:30:09.476 "num_base_bdevs_discovered": 3, 00:30:09.476 "num_base_bdevs_operational": 3, 00:30:09.476 "base_bdevs_list": [ 00:30:09.476 { 00:30:09.476 "name": null, 00:30:09.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:09.476 "is_configured": false, 00:30:09.476 "data_offset": 2048, 00:30:09.476 "data_size": 63488 00:30:09.476 }, 00:30:09.476 { 00:30:09.476 "name": "BaseBdev2", 00:30:09.476 "uuid": "e1ef2ab1-b81a-54ff-9294-01743655f17b", 00:30:09.476 "is_configured": true, 00:30:09.476 "data_offset": 2048, 00:30:09.476 "data_size": 63488 00:30:09.476 }, 00:30:09.476 { 00:30:09.476 "name": "BaseBdev3", 
00:30:09.476 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:09.476 "is_configured": true, 00:30:09.476 "data_offset": 2048, 00:30:09.477 "data_size": 63488 00:30:09.477 }, 00:30:09.477 { 00:30:09.477 "name": "BaseBdev4", 00:30:09.477 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:09.477 "is_configured": true, 00:30:09.477 "data_offset": 2048, 00:30:09.477 "data_size": 63488 00:30:09.477 } 00:30:09.477 ] 00:30:09.477 }' 00:30:09.477 17:24:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:09.477 17:24:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:09.734 17:24:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:09.993 [2024-07-23 17:24:05.311217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:09.993 [2024-07-23 17:24:05.315260] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d3680 00:30:09.993 [2024-07-23 17:24:05.317627] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:09.993 17:24:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.928 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.186 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:11.186 "name": "raid_bdev1", 00:30:11.186 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:11.186 "strip_size_kb": 0, 00:30:11.186 "state": "online", 00:30:11.186 "raid_level": "raid1", 00:30:11.186 "superblock": true, 00:30:11.186 "num_base_bdevs": 4, 00:30:11.186 "num_base_bdevs_discovered": 4, 00:30:11.186 "num_base_bdevs_operational": 4, 00:30:11.186 "process": { 00:30:11.186 "type": "rebuild", 00:30:11.186 "target": "spare", 00:30:11.186 "progress": { 00:30:11.186 "blocks": 22528, 00:30:11.186 "percent": 35 00:30:11.186 } 00:30:11.186 }, 00:30:11.186 "base_bdevs_list": [ 00:30:11.186 { 00:30:11.186 "name": "spare", 00:30:11.186 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:11.186 "is_configured": true, 00:30:11.186 "data_offset": 2048, 00:30:11.186 "data_size": 63488 00:30:11.186 }, 00:30:11.186 { 00:30:11.186 "name": "BaseBdev2", 00:30:11.186 "uuid": "e1ef2ab1-b81a-54ff-9294-01743655f17b", 00:30:11.186 "is_configured": true, 00:30:11.186 "data_offset": 2048, 00:30:11.186 "data_size": 63488 00:30:11.186 }, 00:30:11.186 { 00:30:11.186 "name": "BaseBdev3", 00:30:11.186 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:11.186 "is_configured": true, 00:30:11.186 "data_offset": 2048, 00:30:11.186 "data_size": 63488 00:30:11.186 }, 00:30:11.186 { 00:30:11.186 "name": "BaseBdev4", 00:30:11.186 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:11.186 "is_configured": true, 00:30:11.186 "data_offset": 2048, 00:30:11.186 "data_size": 63488 00:30:11.186 } 00:30:11.186 ] 00:30:11.186 }' 00:30:11.186 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:30:11.186 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:11.186 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:11.445 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:11.445 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:11.445 [2024-07-23 17:24:06.848522] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:11.703 [2024-07-23 17:24:06.930119] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:11.704 [2024-07-23 17:24:06.930162] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:11.704 [2024-07-23 17:24:06.930179] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:11.704 [2024-07-23 17:24:06.930188] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:11.704 17:24:06 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.704 17:24:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.962 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:11.963 "name": "raid_bdev1", 00:30:11.963 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:11.963 "strip_size_kb": 0, 00:30:11.963 "state": "online", 00:30:11.963 "raid_level": "raid1", 00:30:11.963 "superblock": true, 00:30:11.963 "num_base_bdevs": 4, 00:30:11.963 "num_base_bdevs_discovered": 3, 00:30:11.963 "num_base_bdevs_operational": 3, 00:30:11.963 "base_bdevs_list": [ 00:30:11.963 { 00:30:11.963 "name": null, 00:30:11.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:11.963 "is_configured": false, 00:30:11.963 "data_offset": 2048, 00:30:11.963 "data_size": 63488 00:30:11.963 }, 00:30:11.963 { 00:30:11.963 "name": "BaseBdev2", 00:30:11.963 "uuid": "e1ef2ab1-b81a-54ff-9294-01743655f17b", 00:30:11.963 "is_configured": true, 00:30:11.963 "data_offset": 2048, 00:30:11.963 "data_size": 63488 00:30:11.963 }, 00:30:11.963 { 00:30:11.963 "name": "BaseBdev3", 00:30:11.963 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:11.963 "is_configured": true, 00:30:11.963 "data_offset": 2048, 00:30:11.963 "data_size": 63488 00:30:11.963 }, 00:30:11.963 { 00:30:11.963 "name": "BaseBdev4", 00:30:11.963 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:11.963 "is_configured": true, 00:30:11.963 "data_offset": 2048, 00:30:11.963 "data_size": 63488 
00:30:11.963 } 00:30:11.963 ] 00:30:11.963 }' 00:30:11.963 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:11.963 17:24:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.529 17:24:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.788 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:12.788 "name": "raid_bdev1", 00:30:12.788 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:12.788 "strip_size_kb": 0, 00:30:12.788 "state": "online", 00:30:12.788 "raid_level": "raid1", 00:30:12.788 "superblock": true, 00:30:12.788 "num_base_bdevs": 4, 00:30:12.788 "num_base_bdevs_discovered": 3, 00:30:12.788 "num_base_bdevs_operational": 3, 00:30:12.788 "base_bdevs_list": [ 00:30:12.788 { 00:30:12.788 "name": null, 00:30:12.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:12.788 "is_configured": false, 00:30:12.788 "data_offset": 2048, 00:30:12.788 "data_size": 63488 00:30:12.788 }, 00:30:12.788 { 00:30:12.788 "name": "BaseBdev2", 00:30:12.788 "uuid": "e1ef2ab1-b81a-54ff-9294-01743655f17b", 00:30:12.788 "is_configured": true, 00:30:12.788 
"data_offset": 2048, 00:30:12.788 "data_size": 63488 00:30:12.788 }, 00:30:12.788 { 00:30:12.788 "name": "BaseBdev3", 00:30:12.788 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:12.788 "is_configured": true, 00:30:12.788 "data_offset": 2048, 00:30:12.788 "data_size": 63488 00:30:12.788 }, 00:30:12.788 { 00:30:12.788 "name": "BaseBdev4", 00:30:12.788 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:12.788 "is_configured": true, 00:30:12.788 "data_offset": 2048, 00:30:12.788 "data_size": 63488 00:30:12.788 } 00:30:12.788 ] 00:30:12.788 }' 00:30:12.788 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:12.788 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:12.788 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:12.788 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:12.788 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:13.046 [2024-07-23 17:24:08.349864] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:13.046 [2024-07-23 17:24:08.353904] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x221d500 00:30:13.046 [2024-07-23 17:24:08.355526] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:13.046 17:24:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.980 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:14.239 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:14.239 "name": "raid_bdev1", 00:30:14.239 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:14.239 "strip_size_kb": 0, 00:30:14.239 "state": "online", 00:30:14.239 "raid_level": "raid1", 00:30:14.239 "superblock": true, 00:30:14.239 "num_base_bdevs": 4, 00:30:14.239 "num_base_bdevs_discovered": 4, 00:30:14.239 "num_base_bdevs_operational": 4, 00:30:14.239 "process": { 00:30:14.239 "type": "rebuild", 00:30:14.239 "target": "spare", 00:30:14.239 "progress": { 00:30:14.239 "blocks": 24576, 00:30:14.239 "percent": 38 00:30:14.239 } 00:30:14.239 }, 00:30:14.239 "base_bdevs_list": [ 00:30:14.239 { 00:30:14.239 "name": "spare", 00:30:14.239 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:14.239 "is_configured": true, 00:30:14.239 "data_offset": 2048, 00:30:14.239 "data_size": 63488 00:30:14.239 }, 00:30:14.239 { 00:30:14.239 "name": "BaseBdev2", 00:30:14.239 "uuid": "e1ef2ab1-b81a-54ff-9294-01743655f17b", 00:30:14.239 "is_configured": true, 00:30:14.239 "data_offset": 2048, 00:30:14.239 "data_size": 63488 00:30:14.239 }, 00:30:14.239 { 00:30:14.239 "name": "BaseBdev3", 00:30:14.239 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:14.239 "is_configured": true, 00:30:14.239 "data_offset": 2048, 00:30:14.239 "data_size": 63488 00:30:14.239 }, 00:30:14.239 { 00:30:14.239 "name": 
"BaseBdev4", 00:30:14.239 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:14.239 "is_configured": true, 00:30:14.239 "data_offset": 2048, 00:30:14.239 "data_size": 63488 00:30:14.239 } 00:30:14.239 ] 00:30:14.239 }' 00:30:14.239 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:30:14.498 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:30:14.498 17:24:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:30:14.757 [2024-07-23 17:24:09.943203] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:14.757 [2024-07-23 17:24:10.068425] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x221d500 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:14.757 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.015 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:15.015 "name": "raid_bdev1", 00:30:15.015 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:15.015 "strip_size_kb": 0, 00:30:15.015 "state": "online", 00:30:15.015 "raid_level": "raid1", 00:30:15.015 "superblock": true, 00:30:15.015 "num_base_bdevs": 4, 00:30:15.015 "num_base_bdevs_discovered": 3, 00:30:15.015 "num_base_bdevs_operational": 3, 00:30:15.015 "process": { 00:30:15.015 "type": "rebuild", 00:30:15.015 "target": "spare", 00:30:15.015 "progress": { 00:30:15.015 "blocks": 34816, 00:30:15.015 "percent": 54 00:30:15.015 } 00:30:15.015 }, 00:30:15.015 "base_bdevs_list": [ 00:30:15.015 { 00:30:15.015 "name": "spare", 00:30:15.015 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:15.015 "is_configured": true, 00:30:15.015 "data_offset": 2048, 00:30:15.015 "data_size": 63488 00:30:15.015 }, 00:30:15.015 { 00:30:15.015 "name": null, 00:30:15.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:15.015 "is_configured": false, 00:30:15.015 "data_offset": 2048, 00:30:15.015 
"data_size": 63488 00:30:15.015 }, 00:30:15.015 { 00:30:15.015 "name": "BaseBdev3", 00:30:15.015 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:15.015 "is_configured": true, 00:30:15.015 "data_offset": 2048, 00:30:15.015 "data_size": 63488 00:30:15.015 }, 00:30:15.015 { 00:30:15.015 "name": "BaseBdev4", 00:30:15.015 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:15.015 "is_configured": true, 00:30:15.016 "data_offset": 2048, 00:30:15.016 "data_size": 63488 00:30:15.016 } 00:30:15.016 ] 00:30:15.016 }' 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=950 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:15.016 17:24:10 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.274 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:15.274 "name": "raid_bdev1", 00:30:15.274 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:15.274 "strip_size_kb": 0, 00:30:15.274 "state": "online", 00:30:15.274 "raid_level": "raid1", 00:30:15.274 "superblock": true, 00:30:15.274 "num_base_bdevs": 4, 00:30:15.274 "num_base_bdevs_discovered": 3, 00:30:15.274 "num_base_bdevs_operational": 3, 00:30:15.274 "process": { 00:30:15.274 "type": "rebuild", 00:30:15.274 "target": "spare", 00:30:15.274 "progress": { 00:30:15.274 "blocks": 43008, 00:30:15.274 "percent": 67 00:30:15.274 } 00:30:15.274 }, 00:30:15.274 "base_bdevs_list": [ 00:30:15.274 { 00:30:15.274 "name": "spare", 00:30:15.274 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:15.274 "is_configured": true, 00:30:15.274 "data_offset": 2048, 00:30:15.274 "data_size": 63488 00:30:15.274 }, 00:30:15.274 { 00:30:15.274 "name": null, 00:30:15.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:15.274 "is_configured": false, 00:30:15.274 "data_offset": 2048, 00:30:15.274 "data_size": 63488 00:30:15.274 }, 00:30:15.274 { 00:30:15.274 "name": "BaseBdev3", 00:30:15.274 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:15.274 "is_configured": true, 00:30:15.274 "data_offset": 2048, 00:30:15.274 "data_size": 63488 00:30:15.274 }, 00:30:15.274 { 00:30:15.274 "name": "BaseBdev4", 00:30:15.274 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:15.274 "is_configured": true, 00:30:15.274 "data_offset": 2048, 00:30:15.274 "data_size": 63488 00:30:15.274 } 00:30:15.274 ] 00:30:15.274 }' 00:30:15.274 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:15.274 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:15.274 17:24:10 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:15.532 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:15.532 17:24:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:16.469 [2024-07-23 17:24:11.580200] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:30:16.469 [2024-07-23 17:24:11.580263] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:16.469 [2024-07-23 17:24:11.580370] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:16.469 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:16.727 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:16.727 "name": "raid_bdev1", 00:30:16.727 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:16.727 "strip_size_kb": 0, 00:30:16.727 "state": "online", 00:30:16.728 "raid_level": "raid1", 00:30:16.728 "superblock": true, 00:30:16.728 "num_base_bdevs": 
4, 00:30:16.728 "num_base_bdevs_discovered": 3, 00:30:16.728 "num_base_bdevs_operational": 3, 00:30:16.728 "base_bdevs_list": [ 00:30:16.728 { 00:30:16.728 "name": "spare", 00:30:16.728 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:16.728 "is_configured": true, 00:30:16.728 "data_offset": 2048, 00:30:16.728 "data_size": 63488 00:30:16.728 }, 00:30:16.728 { 00:30:16.728 "name": null, 00:30:16.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:16.728 "is_configured": false, 00:30:16.728 "data_offset": 2048, 00:30:16.728 "data_size": 63488 00:30:16.728 }, 00:30:16.728 { 00:30:16.728 "name": "BaseBdev3", 00:30:16.728 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:16.728 "is_configured": true, 00:30:16.728 "data_offset": 2048, 00:30:16.728 "data_size": 63488 00:30:16.728 }, 00:30:16.728 { 00:30:16.728 "name": "BaseBdev4", 00:30:16.728 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:16.728 "is_configured": true, 00:30:16.728 "data_offset": 2048, 00:30:16.728 "data_size": 63488 00:30:16.728 } 00:30:16.728 ] 00:30:16.728 }' 00:30:16.728 17:24:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:16.728 17:24:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:16.728 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:17.358 "name": "raid_bdev1", 00:30:17.358 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:17.358 "strip_size_kb": 0, 00:30:17.358 "state": "online", 00:30:17.358 "raid_level": "raid1", 00:30:17.358 "superblock": true, 00:30:17.358 "num_base_bdevs": 4, 00:30:17.358 "num_base_bdevs_discovered": 3, 00:30:17.358 "num_base_bdevs_operational": 3, 00:30:17.358 "base_bdevs_list": [ 00:30:17.358 { 00:30:17.358 "name": "spare", 00:30:17.358 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:17.358 "is_configured": true, 00:30:17.358 "data_offset": 2048, 00:30:17.358 "data_size": 63488 00:30:17.358 }, 00:30:17.358 { 00:30:17.358 "name": null, 00:30:17.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:17.358 "is_configured": false, 00:30:17.358 "data_offset": 2048, 00:30:17.358 "data_size": 63488 00:30:17.358 }, 00:30:17.358 { 00:30:17.358 "name": "BaseBdev3", 00:30:17.358 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:17.358 "is_configured": true, 00:30:17.358 "data_offset": 2048, 00:30:17.358 "data_size": 63488 00:30:17.358 }, 00:30:17.358 { 00:30:17.358 "name": "BaseBdev4", 00:30:17.358 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:17.358 "is_configured": true, 00:30:17.358 "data_offset": 2048, 00:30:17.358 "data_size": 63488 00:30:17.358 } 00:30:17.358 ] 00:30:17.358 }' 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:17.358 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:17.616 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:17.616 "name": "raid_bdev1", 00:30:17.616 "uuid": 
"7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:17.616 "strip_size_kb": 0, 00:30:17.616 "state": "online", 00:30:17.616 "raid_level": "raid1", 00:30:17.616 "superblock": true, 00:30:17.616 "num_base_bdevs": 4, 00:30:17.616 "num_base_bdevs_discovered": 3, 00:30:17.616 "num_base_bdevs_operational": 3, 00:30:17.616 "base_bdevs_list": [ 00:30:17.616 { 00:30:17.616 "name": "spare", 00:30:17.616 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:17.616 "is_configured": true, 00:30:17.616 "data_offset": 2048, 00:30:17.616 "data_size": 63488 00:30:17.616 }, 00:30:17.616 { 00:30:17.616 "name": null, 00:30:17.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:17.616 "is_configured": false, 00:30:17.616 "data_offset": 2048, 00:30:17.616 "data_size": 63488 00:30:17.616 }, 00:30:17.616 { 00:30:17.616 "name": "BaseBdev3", 00:30:17.616 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:17.616 "is_configured": true, 00:30:17.616 "data_offset": 2048, 00:30:17.616 "data_size": 63488 00:30:17.616 }, 00:30:17.616 { 00:30:17.616 "name": "BaseBdev4", 00:30:17.616 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:17.616 "is_configured": true, 00:30:17.616 "data_offset": 2048, 00:30:17.616 "data_size": 63488 00:30:17.616 } 00:30:17.616 ] 00:30:17.616 }' 00:30:17.617 17:24:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:17.617 17:24:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:18.183 17:24:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:18.441 [2024-07-23 17:24:13.774713] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:18.441 [2024-07-23 17:24:13.774744] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:18.441 [2024-07-23 17:24:13.774801] bdev_raid.c: 486:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:30:18.441 [2024-07-23 17:24:13.774868] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:18.441 [2024-07-23 17:24:13.774880] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x221d530 name raid_bdev1, state offline 00:30:18.441 17:24:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:30:18.441 17:24:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:18.699 17:24:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:30:19.266 /dev/nbd0 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:19.266 1+0 records in 00:30:19.266 1+0 records out 00:30:19.266 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270182 s, 15.2 MB/s 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:19.266 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:30:19.525 /dev/nbd1 00:30:19.525 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:19.525 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:19.525 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:19.525 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:30:19.525 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:19.526 1+0 records in 00:30:19.526 1+0 records out 00:30:19.526 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325389 
s, 12.6 MB/s 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:19.526 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:19.785 17:24:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:20.043 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:30:20.301 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:20.559 17:24:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:20.818 [2024-07-23 17:24:15.991661] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:20.818 [2024-07-23 17:24:15.991707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:20.818 [2024-07-23 17:24:15.991730] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d2ef0 00:30:20.818 [2024-07-23 17:24:15.991743] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:20.818 [2024-07-23 17:24:15.993376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:20.818 [2024-07-23 17:24:15.993407] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:20.818 [2024-07-23 17:24:15.993488] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:20.818 [2024-07-23 17:24:15.993515] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:20.819 [2024-07-23 17:24:15.993620] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:20.819 [2024-07-23 17:24:15.993692] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:20.819 spare 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:20.819 17:24:16 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:20.819 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:20.819 [2024-07-23 17:24:16.094008] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x221b680 00:30:20.819 [2024-07-23 17:24:16.094027] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:30:20.819 [2024-07-23 17:24:16.094223] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x221d500 00:30:20.819 [2024-07-23 17:24:16.094377] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x221b680 00:30:20.819 [2024-07-23 17:24:16.094387] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x221b680 00:30:20.819 [2024-07-23 17:24:16.094491] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:21.077 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:21.077 "name": "raid_bdev1", 00:30:21.077 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:21.077 "strip_size_kb": 0, 00:30:21.077 "state": "online", 00:30:21.077 "raid_level": "raid1", 
00:30:21.077 "superblock": true, 00:30:21.077 "num_base_bdevs": 4, 00:30:21.077 "num_base_bdevs_discovered": 3, 00:30:21.077 "num_base_bdevs_operational": 3, 00:30:21.077 "base_bdevs_list": [ 00:30:21.077 { 00:30:21.077 "name": "spare", 00:30:21.077 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:21.077 "is_configured": true, 00:30:21.077 "data_offset": 2048, 00:30:21.077 "data_size": 63488 00:30:21.077 }, 00:30:21.077 { 00:30:21.077 "name": null, 00:30:21.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.077 "is_configured": false, 00:30:21.077 "data_offset": 2048, 00:30:21.077 "data_size": 63488 00:30:21.077 }, 00:30:21.077 { 00:30:21.077 "name": "BaseBdev3", 00:30:21.077 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:21.077 "is_configured": true, 00:30:21.077 "data_offset": 2048, 00:30:21.077 "data_size": 63488 00:30:21.077 }, 00:30:21.077 { 00:30:21.077 "name": "BaseBdev4", 00:30:21.077 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:21.077 "is_configured": true, 00:30:21.077 "data_offset": 2048, 00:30:21.077 "data_size": 63488 00:30:21.077 } 00:30:21.077 ] 00:30:21.077 }' 00:30:21.077 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:21.077 17:24:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.642 17:24:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:21.642 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:21.642 "name": "raid_bdev1", 00:30:21.642 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:21.642 "strip_size_kb": 0, 00:30:21.642 "state": "online", 00:30:21.642 "raid_level": "raid1", 00:30:21.642 "superblock": true, 00:30:21.642 "num_base_bdevs": 4, 00:30:21.642 "num_base_bdevs_discovered": 3, 00:30:21.642 "num_base_bdevs_operational": 3, 00:30:21.642 "base_bdevs_list": [ 00:30:21.642 { 00:30:21.642 "name": "spare", 00:30:21.642 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:21.642 "is_configured": true, 00:30:21.642 "data_offset": 2048, 00:30:21.642 "data_size": 63488 00:30:21.642 }, 00:30:21.642 { 00:30:21.642 "name": null, 00:30:21.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.642 "is_configured": false, 00:30:21.642 "data_offset": 2048, 00:30:21.642 "data_size": 63488 00:30:21.642 }, 00:30:21.642 { 00:30:21.642 "name": "BaseBdev3", 00:30:21.642 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:21.642 "is_configured": true, 00:30:21.642 "data_offset": 2048, 00:30:21.642 "data_size": 63488 00:30:21.642 }, 00:30:21.642 { 00:30:21.642 "name": "BaseBdev4", 00:30:21.642 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:21.642 "is_configured": true, 00:30:21.642 "data_offset": 2048, 00:30:21.642 "data_size": 63488 00:30:21.642 } 00:30:21.642 ] 00:30:21.642 }' 00:30:21.642 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:30:21.901 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:22.160 [2024-07-23 17:24:17.535881] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:22.160 17:24:17 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:22.160 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:22.419 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:22.419 "name": "raid_bdev1", 00:30:22.419 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:22.419 "strip_size_kb": 0, 00:30:22.419 "state": "online", 00:30:22.419 "raid_level": "raid1", 00:30:22.419 "superblock": true, 00:30:22.419 "num_base_bdevs": 4, 00:30:22.419 "num_base_bdevs_discovered": 2, 00:30:22.419 "num_base_bdevs_operational": 2, 00:30:22.419 "base_bdevs_list": [ 00:30:22.419 { 00:30:22.419 "name": null, 00:30:22.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:22.420 "is_configured": false, 00:30:22.420 "data_offset": 2048, 00:30:22.420 "data_size": 63488 00:30:22.420 }, 00:30:22.420 { 00:30:22.420 "name": null, 00:30:22.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:22.420 "is_configured": false, 00:30:22.420 "data_offset": 2048, 00:30:22.420 "data_size": 63488 00:30:22.420 }, 00:30:22.420 { 00:30:22.420 "name": "BaseBdev3", 00:30:22.420 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:22.420 "is_configured": true, 00:30:22.420 "data_offset": 2048, 00:30:22.420 "data_size": 63488 00:30:22.420 }, 00:30:22.420 { 00:30:22.420 "name": "BaseBdev4", 00:30:22.420 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:22.420 "is_configured": true, 00:30:22.420 "data_offset": 2048, 00:30:22.420 "data_size": 63488 00:30:22.420 } 00:30:22.420 ] 00:30:22.420 }' 00:30:22.420 17:24:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:22.420 17:24:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:23.355 17:24:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:23.355 [2024-07-23 17:24:18.646835] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:23.355 [2024-07-23 17:24:18.646988] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:30:23.355 [2024-07-23 17:24:18.647005] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:23.355 [2024-07-23 17:24:18.647039] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:23.355 [2024-07-23 17:24:18.650933] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d0e80 00:30:23.355 [2024-07-23 17:24:18.652337] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:23.355 17:24:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:24.291 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:24.550 17:24:19 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:24.550 "name": "raid_bdev1", 00:30:24.550 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:24.550 "strip_size_kb": 0, 00:30:24.550 "state": "online", 00:30:24.550 "raid_level": "raid1", 00:30:24.550 "superblock": true, 00:30:24.550 "num_base_bdevs": 4, 00:30:24.550 "num_base_bdevs_discovered": 3, 00:30:24.550 "num_base_bdevs_operational": 3, 00:30:24.550 "process": { 00:30:24.550 "type": "rebuild", 00:30:24.550 "target": "spare", 00:30:24.550 "progress": { 00:30:24.550 "blocks": 24576, 00:30:24.550 "percent": 38 00:30:24.550 } 00:30:24.550 }, 00:30:24.550 "base_bdevs_list": [ 00:30:24.550 { 00:30:24.550 "name": "spare", 00:30:24.550 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:24.550 "is_configured": true, 00:30:24.550 "data_offset": 2048, 00:30:24.550 "data_size": 63488 00:30:24.550 }, 00:30:24.550 { 00:30:24.550 "name": null, 00:30:24.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:24.550 "is_configured": false, 00:30:24.550 "data_offset": 2048, 00:30:24.550 "data_size": 63488 00:30:24.550 }, 00:30:24.550 { 00:30:24.550 "name": "BaseBdev3", 00:30:24.550 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:24.550 "is_configured": true, 00:30:24.550 "data_offset": 2048, 00:30:24.550 "data_size": 63488 00:30:24.550 }, 00:30:24.550 { 00:30:24.550 "name": "BaseBdev4", 00:30:24.550 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:24.550 "is_configured": true, 00:30:24.550 "data_offset": 2048, 00:30:24.550 "data_size": 63488 00:30:24.550 } 00:30:24.550 ] 00:30:24.550 }' 00:30:24.550 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:24.809 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:24.809 17:24:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:24.809 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:30:24.809 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:25.068 [2024-07-23 17:24:20.252469] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:25.068 [2024-07-23 17:24:20.264445] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:25.068 [2024-07-23 17:24:20.264488] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:25.068 [2024-07-23 17:24:20.264504] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:25.068 [2024-07-23 17:24:20.264513] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:25.068 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:25.068 17:24:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.069 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:25.328 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:25.328 "name": "raid_bdev1", 00:30:25.328 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:25.328 "strip_size_kb": 0, 00:30:25.328 "state": "online", 00:30:25.328 "raid_level": "raid1", 00:30:25.328 "superblock": true, 00:30:25.328 "num_base_bdevs": 4, 00:30:25.328 "num_base_bdevs_discovered": 2, 00:30:25.328 "num_base_bdevs_operational": 2, 00:30:25.328 "base_bdevs_list": [ 00:30:25.328 { 00:30:25.328 "name": null, 00:30:25.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:25.328 "is_configured": false, 00:30:25.328 "data_offset": 2048, 00:30:25.328 "data_size": 63488 00:30:25.328 }, 00:30:25.328 { 00:30:25.328 "name": null, 00:30:25.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:25.328 "is_configured": false, 00:30:25.328 "data_offset": 2048, 00:30:25.328 "data_size": 63488 00:30:25.328 }, 00:30:25.328 { 00:30:25.328 "name": "BaseBdev3", 00:30:25.328 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:25.328 "is_configured": true, 00:30:25.328 "data_offset": 2048, 00:30:25.328 "data_size": 63488 00:30:25.328 }, 00:30:25.328 { 00:30:25.328 "name": "BaseBdev4", 00:30:25.328 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:25.328 "is_configured": true, 00:30:25.328 "data_offset": 2048, 00:30:25.328 "data_size": 63488 00:30:25.328 } 00:30:25.328 ] 00:30:25.328 }' 00:30:25.328 17:24:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:25.328 17:24:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:25.896 17:24:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:26.155 [2024-07-23 17:24:21.439598] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:26.155 [2024-07-23 17:24:21.439651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:26.155 [2024-07-23 17:24:21.439673] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ccd10 00:30:26.155 [2024-07-23 17:24:21.439686] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:26.155 [2024-07-23 17:24:21.440066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:26.155 [2024-07-23 17:24:21.440085] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:26.155 [2024-07-23 17:24:21.440167] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:26.155 [2024-07-23 17:24:21.440179] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:30:26.155 [2024-07-23 17:24:21.440190] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:30:26.155 [2024-07-23 17:24:21.440209] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:26.155 [2024-07-23 17:24:21.444103] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d1460 00:30:26.155 spare 00:30:26.155 [2024-07-23 17:24:21.445511] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:26.155 17:24:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:27.091 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:27.350 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:27.350 "name": "raid_bdev1", 00:30:27.350 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:27.350 "strip_size_kb": 0, 00:30:27.350 "state": "online", 00:30:27.350 "raid_level": "raid1", 00:30:27.350 "superblock": true, 00:30:27.350 "num_base_bdevs": 4, 00:30:27.350 "num_base_bdevs_discovered": 3, 00:30:27.350 "num_base_bdevs_operational": 3, 00:30:27.350 "process": { 00:30:27.350 "type": "rebuild", 00:30:27.350 "target": "spare", 00:30:27.350 "progress": { 00:30:27.350 "blocks": 24576, 00:30:27.350 
"percent": 38 00:30:27.350 } 00:30:27.350 }, 00:30:27.350 "base_bdevs_list": [ 00:30:27.350 { 00:30:27.350 "name": "spare", 00:30:27.350 "uuid": "0c8e9283-3712-5e30-940d-688c9591546f", 00:30:27.350 "is_configured": true, 00:30:27.350 "data_offset": 2048, 00:30:27.350 "data_size": 63488 00:30:27.350 }, 00:30:27.350 { 00:30:27.350 "name": null, 00:30:27.350 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:27.350 "is_configured": false, 00:30:27.350 "data_offset": 2048, 00:30:27.350 "data_size": 63488 00:30:27.350 }, 00:30:27.350 { 00:30:27.350 "name": "BaseBdev3", 00:30:27.350 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:27.350 "is_configured": true, 00:30:27.350 "data_offset": 2048, 00:30:27.350 "data_size": 63488 00:30:27.350 }, 00:30:27.350 { 00:30:27.350 "name": "BaseBdev4", 00:30:27.350 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:27.350 "is_configured": true, 00:30:27.350 "data_offset": 2048, 00:30:27.350 "data_size": 63488 00:30:27.350 } 00:30:27.350 ] 00:30:27.350 }' 00:30:27.350 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:27.350 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:27.350 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:27.609 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:27.609 17:24:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:27.868 [2024-07-23 17:24:23.037235] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:27.868 [2024-07-23 17:24:23.058261] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:27.868 [2024-07-23 17:24:23.058303] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:27.868 [2024-07-23 17:24:23.058320] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:27.868 [2024-07-23 17:24:23.058328] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:27.868 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:28.129 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:28.129 "name": "raid_bdev1", 00:30:28.129 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:28.129 "strip_size_kb": 0, 00:30:28.129 "state": 
"online", 00:30:28.129 "raid_level": "raid1", 00:30:28.129 "superblock": true, 00:30:28.129 "num_base_bdevs": 4, 00:30:28.129 "num_base_bdevs_discovered": 2, 00:30:28.129 "num_base_bdevs_operational": 2, 00:30:28.129 "base_bdevs_list": [ 00:30:28.129 { 00:30:28.129 "name": null, 00:30:28.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:28.129 "is_configured": false, 00:30:28.129 "data_offset": 2048, 00:30:28.129 "data_size": 63488 00:30:28.129 }, 00:30:28.129 { 00:30:28.129 "name": null, 00:30:28.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:28.129 "is_configured": false, 00:30:28.129 "data_offset": 2048, 00:30:28.129 "data_size": 63488 00:30:28.129 }, 00:30:28.129 { 00:30:28.129 "name": "BaseBdev3", 00:30:28.129 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:28.129 "is_configured": true, 00:30:28.129 "data_offset": 2048, 00:30:28.129 "data_size": 63488 00:30:28.129 }, 00:30:28.129 { 00:30:28.129 "name": "BaseBdev4", 00:30:28.129 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:28.129 "is_configured": true, 00:30:28.129 "data_offset": 2048, 00:30:28.129 "data_size": 63488 00:30:28.129 } 00:30:28.129 ] 00:30:28.129 }' 00:30:28.129 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:28.129 17:24:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# jq -r '.[] | select(.name == "raid_bdev1")' 00:30:28.698 17:24:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:28.957 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:28.957 "name": "raid_bdev1", 00:30:28.957 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:28.957 "strip_size_kb": 0, 00:30:28.957 "state": "online", 00:30:28.957 "raid_level": "raid1", 00:30:28.957 "superblock": true, 00:30:28.957 "num_base_bdevs": 4, 00:30:28.957 "num_base_bdevs_discovered": 2, 00:30:28.957 "num_base_bdevs_operational": 2, 00:30:28.957 "base_bdevs_list": [ 00:30:28.957 { 00:30:28.957 "name": null, 00:30:28.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:28.957 "is_configured": false, 00:30:28.957 "data_offset": 2048, 00:30:28.957 "data_size": 63488 00:30:28.957 }, 00:30:28.957 { 00:30:28.957 "name": null, 00:30:28.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:28.957 "is_configured": false, 00:30:28.957 "data_offset": 2048, 00:30:28.957 "data_size": 63488 00:30:28.957 }, 00:30:28.957 { 00:30:28.957 "name": "BaseBdev3", 00:30:28.957 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:28.957 "is_configured": true, 00:30:28.957 "data_offset": 2048, 00:30:28.957 "data_size": 63488 00:30:28.957 }, 00:30:28.957 { 00:30:28.957 "name": "BaseBdev4", 00:30:28.957 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:28.957 "is_configured": true, 00:30:28.957 "data_offset": 2048, 00:30:28.957 "data_size": 63488 00:30:28.957 } 00:30:28.957 ] 00:30:28.957 }' 00:30:28.957 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:28.957 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:28.957 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:30:28.957 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:28.957 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:29.215 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:29.473 [2024-07-23 17:24:24.803091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:29.473 [2024-07-23 17:24:24.803145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:29.473 [2024-07-23 17:24:24.803173] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22d31f0 00:30:29.473 [2024-07-23 17:24:24.803186] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:29.473 [2024-07-23 17:24:24.803544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:29.473 [2024-07-23 17:24:24.803562] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:29.473 [2024-07-23 17:24:24.803630] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:29.473 [2024-07-23 17:24:24.803643] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:30:29.473 [2024-07-23 17:24:24.803653] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:29.473 BaseBdev1 00:30:29.473 17:24:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:30.408 
17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:30.408 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:30.668 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:30.668 17:24:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:30.668 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:30.668 "name": "raid_bdev1", 00:30:30.668 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:30.668 "strip_size_kb": 0, 00:30:30.668 "state": "online", 00:30:30.668 "raid_level": "raid1", 00:30:30.668 "superblock": true, 00:30:30.668 "num_base_bdevs": 4, 00:30:30.668 "num_base_bdevs_discovered": 2, 00:30:30.668 "num_base_bdevs_operational": 2, 00:30:30.668 "base_bdevs_list": [ 00:30:30.668 { 00:30:30.668 "name": null, 00:30:30.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:30.668 "is_configured": false, 00:30:30.668 "data_offset": 2048, 00:30:30.668 "data_size": 63488 00:30:30.668 }, 
00:30:30.668 { 00:30:30.668 "name": null, 00:30:30.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:30.668 "is_configured": false, 00:30:30.668 "data_offset": 2048, 00:30:30.668 "data_size": 63488 00:30:30.668 }, 00:30:30.668 { 00:30:30.668 "name": "BaseBdev3", 00:30:30.668 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:30.668 "is_configured": true, 00:30:30.668 "data_offset": 2048, 00:30:30.668 "data_size": 63488 00:30:30.668 }, 00:30:30.668 { 00:30:30.668 "name": "BaseBdev4", 00:30:30.668 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:30.668 "is_configured": true, 00:30:30.668 "data_offset": 2048, 00:30:30.668 "data_size": 63488 00:30:30.668 } 00:30:30.668 ] 00:30:30.668 }' 00:30:30.668 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:30.668 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:31.235 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:31.494 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:31.494 "name": "raid_bdev1", 00:30:31.494 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:31.494 
"strip_size_kb": 0, 00:30:31.494 "state": "online", 00:30:31.494 "raid_level": "raid1", 00:30:31.494 "superblock": true, 00:30:31.494 "num_base_bdevs": 4, 00:30:31.494 "num_base_bdevs_discovered": 2, 00:30:31.494 "num_base_bdevs_operational": 2, 00:30:31.494 "base_bdevs_list": [ 00:30:31.494 { 00:30:31.494 "name": null, 00:30:31.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:31.494 "is_configured": false, 00:30:31.494 "data_offset": 2048, 00:30:31.494 "data_size": 63488 00:30:31.494 }, 00:30:31.494 { 00:30:31.494 "name": null, 00:30:31.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:31.494 "is_configured": false, 00:30:31.494 "data_offset": 2048, 00:30:31.494 "data_size": 63488 00:30:31.494 }, 00:30:31.494 { 00:30:31.494 "name": "BaseBdev3", 00:30:31.494 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:31.494 "is_configured": true, 00:30:31.494 "data_offset": 2048, 00:30:31.494 "data_size": 63488 00:30:31.494 }, 00:30:31.494 { 00:30:31.494 "name": "BaseBdev4", 00:30:31.494 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:31.494 "is_configured": true, 00:30:31.494 "data_offset": 2048, 00:30:31.494 "data_size": 63488 00:30:31.494 } 00:30:31.494 ] 00:30:31.494 }' 00:30:31.494 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:30:31.756 17:24:26 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:31.756 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:31.757 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:31.757 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:31.757 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:31.757 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:31.757 17:24:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:32.038 [2024-07-23 17:24:27.221519] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:32.038 [2024-07-23 17:24:27.221656] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:30:32.038 [2024-07-23 17:24:27.221671] 
bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:32.038 request: 00:30:32.038 { 00:30:32.038 "base_bdev": "BaseBdev1", 00:30:32.038 "raid_bdev": "raid_bdev1", 00:30:32.038 "method": "bdev_raid_add_base_bdev", 00:30:32.038 "req_id": 1 00:30:32.038 } 00:30:32.038 Got JSON-RPC error response 00:30:32.038 response: 00:30:32.038 { 00:30:32.038 "code": -22, 00:30:32.038 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:32.038 } 00:30:32.038 17:24:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:30:32.038 17:24:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:30:32.038 17:24:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:32.038 17:24:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:32.038 17:24:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:32.974 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:33.233 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:33.233 "name": "raid_bdev1", 00:30:33.233 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:33.233 "strip_size_kb": 0, 00:30:33.233 "state": "online", 00:30:33.233 "raid_level": "raid1", 00:30:33.233 "superblock": true, 00:30:33.233 "num_base_bdevs": 4, 00:30:33.233 "num_base_bdevs_discovered": 2, 00:30:33.233 "num_base_bdevs_operational": 2, 00:30:33.233 "base_bdevs_list": [ 00:30:33.233 { 00:30:33.233 "name": null, 00:30:33.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:33.233 "is_configured": false, 00:30:33.233 "data_offset": 2048, 00:30:33.233 "data_size": 63488 00:30:33.233 }, 00:30:33.233 { 00:30:33.233 "name": null, 00:30:33.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:33.234 "is_configured": false, 00:30:33.234 "data_offset": 2048, 00:30:33.234 "data_size": 63488 00:30:33.234 }, 00:30:33.234 { 00:30:33.234 "name": "BaseBdev3", 00:30:33.234 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 00:30:33.234 "is_configured": true, 00:30:33.234 "data_offset": 2048, 00:30:33.234 "data_size": 63488 00:30:33.234 }, 00:30:33.234 { 00:30:33.234 "name": "BaseBdev4", 00:30:33.234 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:33.234 "is_configured": true, 00:30:33.234 "data_offset": 2048, 00:30:33.234 "data_size": 63488 00:30:33.234 } 00:30:33.234 ] 00:30:33.234 }' 00:30:33.234 17:24:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:33.234 17:24:28 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:33.801 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:34.060 "name": "raid_bdev1", 00:30:34.060 "uuid": "7320af82-b596-4eb4-9007-6e28ce83e3b2", 00:30:34.060 "strip_size_kb": 0, 00:30:34.060 "state": "online", 00:30:34.060 "raid_level": "raid1", 00:30:34.060 "superblock": true, 00:30:34.060 "num_base_bdevs": 4, 00:30:34.060 "num_base_bdevs_discovered": 2, 00:30:34.060 "num_base_bdevs_operational": 2, 00:30:34.060 "base_bdevs_list": [ 00:30:34.060 { 00:30:34.060 "name": null, 00:30:34.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:34.060 "is_configured": false, 00:30:34.060 "data_offset": 2048, 00:30:34.060 "data_size": 63488 00:30:34.060 }, 00:30:34.060 { 00:30:34.060 "name": null, 00:30:34.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:34.060 "is_configured": false, 00:30:34.060 "data_offset": 2048, 00:30:34.060 "data_size": 63488 00:30:34.060 }, 00:30:34.060 { 00:30:34.060 "name": "BaseBdev3", 00:30:34.060 "uuid": "db63d50b-9c30-512b-8424-6322f8d6fe47", 
00:30:34.060 "is_configured": true, 00:30:34.060 "data_offset": 2048, 00:30:34.060 "data_size": 63488 00:30:34.060 }, 00:30:34.060 { 00:30:34.060 "name": "BaseBdev4", 00:30:34.060 "uuid": "8e08012d-c740-5a60-8428-c6d3dd0db139", 00:30:34.060 "is_configured": true, 00:30:34.060 "data_offset": 2048, 00:30:34.060 "data_size": 63488 00:30:34.060 } 00:30:34.060 ] 00:30:34.060 }' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 45504 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 45504 ']' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 45504 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 45504 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 45504' 00:30:34.060 killing process with pid 45504 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 45504 00:30:34.060 Received 
shutdown signal, test time was about 60.000000 seconds 00:30:34.060 00:30:34.060 Latency(us) 00:30:34.060 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:34.060 =================================================================================================================== 00:30:34.060 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:34.060 [2024-07-23 17:24:29.428999] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:34.060 [2024-07-23 17:24:29.429092] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:34.060 [2024-07-23 17:24:29.429152] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:34.060 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 45504 00:30:34.060 [2024-07-23 17:24:29.429165] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x221b680 name raid_bdev1, state offline 00:30:34.060 [2024-07-23 17:24:29.476778] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:34.319 17:24:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:30:34.319 00:30:34.319 real 0m37.891s 00:30:34.319 user 0m54.388s 00:30:34.319 sys 0m7.075s 00:30:34.319 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:34.319 17:24:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:30:34.319 ************************************ 00:30:34.319 END TEST raid_rebuild_test_sb 00:30:34.319 ************************************ 00:30:34.319 17:24:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:34.319 17:24:29 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:30:34.319 17:24:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:34.319 17:24:29 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:30:34.319 17:24:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:34.579 ************************************ 00:30:34.579 START TEST raid_rebuild_test_io 00:30:34.579 ************************************ 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=51283 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 51283 /var/tmp/spdk-raid.sock 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
51283 ']' 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:34.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:34.579 17:24:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:30:34.579 [2024-07-23 17:24:29.823354] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:30:34.579 [2024-07-23 17:24:29.823430] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid51283 ] 00:30:34.579 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:34.579 Zero copy mechanism will not be used. 
00:30:34.579 [2024-07-23 17:24:29.955524] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:34.837 [2024-07-23 17:24:30.008583] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:34.837 [2024-07-23 17:24:30.075309] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:34.837 [2024-07-23 17:24:30.075347] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:35.404 17:24:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:35.404 17:24:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:30:35.404 17:24:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:35.404 17:24:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:30:35.664 BaseBdev1_malloc 00:30:35.664 17:24:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:35.923 [2024-07-23 17:24:31.224869] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:35.923 [2024-07-23 17:24:31.224918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:35.923 [2024-07-23 17:24:31.224940] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x218a170 00:30:35.923 [2024-07-23 17:24:31.224952] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:35.923 [2024-07-23 17:24:31.226449] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:35.923 [2024-07-23 17:24:31.226476] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:35.923 BaseBdev1 
00:30:35.923 17:24:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:35.923 17:24:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:30:36.181 BaseBdev2_malloc 00:30:36.181 17:24:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:30:36.440 [2024-07-23 17:24:31.732115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:30:36.440 [2024-07-23 17:24:31.732161] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:36.440 [2024-07-23 17:24:31.732181] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2070680 00:30:36.440 [2024-07-23 17:24:31.732193] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:36.440 [2024-07-23 17:24:31.733695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:36.440 [2024-07-23 17:24:31.733722] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:36.440 BaseBdev2 00:30:36.440 17:24:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:36.440 17:24:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:30:36.698 BaseBdev3_malloc 00:30:36.698 17:24:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:30:36.957 [2024-07-23 17:24:32.217966] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:30:36.957 [2024-07-23 17:24:32.218010] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:36.957 [2024-07-23 17:24:32.218034] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20713d0 00:30:36.957 [2024-07-23 17:24:32.218046] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:36.957 [2024-07-23 17:24:32.219531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:36.957 [2024-07-23 17:24:32.219557] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:30:36.957 BaseBdev3 00:30:36.957 17:24:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:36.957 17:24:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:30:37.215 BaseBdev4_malloc 00:30:37.215 17:24:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:30:37.474 [2024-07-23 17:24:32.707820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:30:37.474 [2024-07-23 17:24:32.707863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:37.474 [2024-07-23 17:24:32.707883] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20752c0 00:30:37.474 [2024-07-23 17:24:32.707901] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:37.474 [2024-07-23 17:24:32.709451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:37.474 [2024-07-23 17:24:32.709478] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:30:37.474 BaseBdev4 00:30:37.474 17:24:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:30:37.732 spare_malloc 00:30:37.732 17:24:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:30:37.991 spare_delay 00:30:37.991 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:38.250 [2024-07-23 17:24:33.439527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:38.250 [2024-07-23 17:24:33.439569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:38.250 [2024-07-23 17:24:33.439589] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2073e80 00:30:38.250 [2024-07-23 17:24:33.439602] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:38.250 [2024-07-23 17:24:33.441149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:38.250 [2024-07-23 17:24:33.441176] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:38.250 spare 00:30:38.250 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:30:38.509 [2024-07-23 17:24:33.680212] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:38.509 [2024-07-23 17:24:33.681539] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:30:38.509 [2024-07-23 17:24:33.681596] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:38.509 [2024-07-23 17:24:33.681640] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:38.509 [2024-07-23 17:24:33.681719] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fd9530 00:30:38.509 [2024-07-23 17:24:33.681730] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:30:38.509 [2024-07-23 17:24:33.681963] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x208f680 00:30:38.509 [2024-07-23 17:24:33.682115] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fd9530 00:30:38.509 [2024-07-23 17:24:33.682125] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fd9530 00:30:38.509 [2024-07-23 17:24:33.682243] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.509 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:38.768 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:38.768 "name": "raid_bdev1", 00:30:38.768 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:38.768 "strip_size_kb": 0, 00:30:38.768 "state": "online", 00:30:38.768 "raid_level": "raid1", 00:30:38.768 "superblock": false, 00:30:38.768 "num_base_bdevs": 4, 00:30:38.768 "num_base_bdevs_discovered": 4, 00:30:38.768 "num_base_bdevs_operational": 4, 00:30:38.768 "base_bdevs_list": [ 00:30:38.768 { 00:30:38.768 "name": "BaseBdev1", 00:30:38.768 "uuid": "6f9ed9d3-8685-50ed-8758-cf071f5b5e82", 00:30:38.768 "is_configured": true, 00:30:38.768 "data_offset": 0, 00:30:38.768 "data_size": 65536 00:30:38.768 }, 00:30:38.768 { 00:30:38.768 "name": "BaseBdev2", 00:30:38.768 "uuid": "6abcf502-5ea1-5c9f-882d-9c345a0fe500", 00:30:38.768 "is_configured": true, 00:30:38.768 "data_offset": 0, 00:30:38.768 "data_size": 65536 00:30:38.768 }, 00:30:38.768 { 00:30:38.768 "name": "BaseBdev3", 00:30:38.768 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:38.768 "is_configured": true, 00:30:38.768 "data_offset": 0, 00:30:38.768 "data_size": 65536 00:30:38.768 }, 00:30:38.768 { 00:30:38.768 "name": "BaseBdev4", 00:30:38.768 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:38.768 "is_configured": true, 00:30:38.768 "data_offset": 0, 00:30:38.768 "data_size": 65536 00:30:38.768 } 00:30:38.768 ] 00:30:38.768 }' 00:30:38.768 17:24:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:30:38.768 17:24:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:30:39.336 17:24:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:30:39.336 17:24:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:39.336 [2024-07-23 17:24:34.755342] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:39.594 17:24:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:30:39.594 17:24:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.594 17:24:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:30:39.853 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:30:39.853 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:30:39.853 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:39.853 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:30:39.853 [2024-07-23 17:24:35.134160] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fd90d0 00:30:39.853 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:39.853 Zero copy mechanism will not be used. 00:30:39.853 Running I/O for 60 seconds... 
00:30:39.853 [2024-07-23 17:24:35.251282] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:39.853 [2024-07-23 17:24:35.251465] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fd90d0 00:30:40.111 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:40.111 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:40.111 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:40.111 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:40.112 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:40.679 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:40.679 "name": "raid_bdev1", 00:30:40.679 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:40.679 "strip_size_kb": 0, 00:30:40.680 "state": "online", 00:30:40.680 "raid_level": "raid1", 00:30:40.680 "superblock": false, 
00:30:40.680 "num_base_bdevs": 4, 00:30:40.680 "num_base_bdevs_discovered": 3, 00:30:40.680 "num_base_bdevs_operational": 3, 00:30:40.680 "base_bdevs_list": [ 00:30:40.680 { 00:30:40.680 "name": null, 00:30:40.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:40.680 "is_configured": false, 00:30:40.680 "data_offset": 0, 00:30:40.680 "data_size": 65536 00:30:40.680 }, 00:30:40.680 { 00:30:40.680 "name": "BaseBdev2", 00:30:40.680 "uuid": "6abcf502-5ea1-5c9f-882d-9c345a0fe500", 00:30:40.680 "is_configured": true, 00:30:40.680 "data_offset": 0, 00:30:40.680 "data_size": 65536 00:30:40.680 }, 00:30:40.680 { 00:30:40.680 "name": "BaseBdev3", 00:30:40.680 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:40.680 "is_configured": true, 00:30:40.680 "data_offset": 0, 00:30:40.680 "data_size": 65536 00:30:40.680 }, 00:30:40.680 { 00:30:40.680 "name": "BaseBdev4", 00:30:40.680 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:40.680 "is_configured": true, 00:30:40.680 "data_offset": 0, 00:30:40.680 "data_size": 65536 00:30:40.680 } 00:30:40.680 ] 00:30:40.680 }' 00:30:40.680 17:24:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:40.680 17:24:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:30:41.248 17:24:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:41.507 [2024-07-23 17:24:36.704220] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:41.507 [2024-07-23 17:24:36.777512] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207a840 00:30:41.507 [2024-07-23 17:24:36.779919] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:41.507 17:24:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:30:41.507 [2024-07-23 
17:24:36.882529] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:30:41.507 [2024-07-23 17:24:36.882923] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:30:41.765 [2024-07-23 17:24:37.048859] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:30:42.333 [2024-07-23 17:24:37.545006] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:42.593 17:24:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:42.593 [2024-07-23 17:24:37.811063] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:30:42.593 [2024-07-23 17:24:37.934254] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:30:42.851 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:42.851 "name": "raid_bdev1", 00:30:42.851 "uuid": 
"c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:42.851 "strip_size_kb": 0, 00:30:42.851 "state": "online", 00:30:42.851 "raid_level": "raid1", 00:30:42.851 "superblock": false, 00:30:42.851 "num_base_bdevs": 4, 00:30:42.852 "num_base_bdevs_discovered": 4, 00:30:42.852 "num_base_bdevs_operational": 4, 00:30:42.852 "process": { 00:30:42.852 "type": "rebuild", 00:30:42.852 "target": "spare", 00:30:42.852 "progress": { 00:30:42.852 "blocks": 16384, 00:30:42.852 "percent": 25 00:30:42.852 } 00:30:42.852 }, 00:30:42.852 "base_bdevs_list": [ 00:30:42.852 { 00:30:42.852 "name": "spare", 00:30:42.852 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:42.852 "is_configured": true, 00:30:42.852 "data_offset": 0, 00:30:42.852 "data_size": 65536 00:30:42.852 }, 00:30:42.852 { 00:30:42.852 "name": "BaseBdev2", 00:30:42.852 "uuid": "6abcf502-5ea1-5c9f-882d-9c345a0fe500", 00:30:42.852 "is_configured": true, 00:30:42.852 "data_offset": 0, 00:30:42.852 "data_size": 65536 00:30:42.852 }, 00:30:42.852 { 00:30:42.852 "name": "BaseBdev3", 00:30:42.852 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:42.852 "is_configured": true, 00:30:42.852 "data_offset": 0, 00:30:42.852 "data_size": 65536 00:30:42.852 }, 00:30:42.852 { 00:30:42.852 "name": "BaseBdev4", 00:30:42.852 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:42.852 "is_configured": true, 00:30:42.852 "data_offset": 0, 00:30:42.852 "data_size": 65536 00:30:42.852 } 00:30:42.852 ] 00:30:42.852 }' 00:30:42.852 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:42.852 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:42.852 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:42.852 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:42.852 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:42.852 [2024-07-23 17:24:38.216275] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:30:43.110 [2024-07-23 17:24:38.377915] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:43.110 [2024-07-23 17:24:38.448329] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:30:43.110 [2024-07-23 17:24:38.458099] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:43.110 [2024-07-23 17:24:38.469067] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:43.110 [2024-07-23 17:24:38.469098] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:43.110 [2024-07-23 17:24:38.469108] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:43.110 [2024-07-23 17:24:38.482544] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1fd90d0 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:43.110 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:43.674 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:43.674 "name": "raid_bdev1", 00:30:43.674 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:43.674 "strip_size_kb": 0, 00:30:43.674 "state": "online", 00:30:43.674 "raid_level": "raid1", 00:30:43.674 "superblock": false, 00:30:43.674 "num_base_bdevs": 4, 00:30:43.674 "num_base_bdevs_discovered": 3, 00:30:43.674 "num_base_bdevs_operational": 3, 00:30:43.674 "base_bdevs_list": [ 00:30:43.674 { 00:30:43.674 "name": null, 00:30:43.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.674 "is_configured": false, 00:30:43.674 "data_offset": 0, 00:30:43.674 "data_size": 65536 00:30:43.674 }, 00:30:43.674 { 00:30:43.674 "name": "BaseBdev2", 00:30:43.674 "uuid": "6abcf502-5ea1-5c9f-882d-9c345a0fe500", 00:30:43.674 "is_configured": true, 00:30:43.674 "data_offset": 0, 00:30:43.674 "data_size": 65536 00:30:43.674 }, 00:30:43.674 { 00:30:43.674 "name": "BaseBdev3", 00:30:43.674 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:43.674 "is_configured": true, 00:30:43.674 "data_offset": 0, 00:30:43.674 "data_size": 65536 00:30:43.674 }, 00:30:43.674 { 00:30:43.674 "name": "BaseBdev4", 00:30:43.674 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:43.674 "is_configured": true, 00:30:43.674 "data_offset": 0, 
00:30:43.674 "data_size": 65536 00:30:43.674 } 00:30:43.674 ] 00:30:43.674 }' 00:30:43.674 17:24:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:43.674 17:24:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.237 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:44.495 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:44.495 "name": "raid_bdev1", 00:30:44.495 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:44.495 "strip_size_kb": 0, 00:30:44.495 "state": "online", 00:30:44.495 "raid_level": "raid1", 00:30:44.495 "superblock": false, 00:30:44.495 "num_base_bdevs": 4, 00:30:44.495 "num_base_bdevs_discovered": 3, 00:30:44.495 "num_base_bdevs_operational": 3, 00:30:44.495 "base_bdevs_list": [ 00:30:44.495 { 00:30:44.495 "name": null, 00:30:44.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:44.495 "is_configured": false, 00:30:44.495 "data_offset": 0, 00:30:44.495 "data_size": 65536 00:30:44.495 }, 00:30:44.495 { 00:30:44.495 "name": "BaseBdev2", 00:30:44.495 "uuid": "6abcf502-5ea1-5c9f-882d-9c345a0fe500", 00:30:44.495 
"is_configured": true, 00:30:44.495 "data_offset": 0, 00:30:44.495 "data_size": 65536 00:30:44.495 }, 00:30:44.495 { 00:30:44.495 "name": "BaseBdev3", 00:30:44.495 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:44.495 "is_configured": true, 00:30:44.495 "data_offset": 0, 00:30:44.495 "data_size": 65536 00:30:44.495 }, 00:30:44.495 { 00:30:44.495 "name": "BaseBdev4", 00:30:44.495 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:44.495 "is_configured": true, 00:30:44.495 "data_offset": 0, 00:30:44.495 "data_size": 65536 00:30:44.495 } 00:30:44.495 ] 00:30:44.495 }' 00:30:44.495 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:44.495 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:44.495 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:44.495 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:44.495 17:24:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:44.754 [2024-07-23 17:24:40.056445] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:44.754 17:24:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:44.754 [2024-07-23 17:24:40.104330] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2187b90 00:30:44.754 [2024-07-23 17:24:40.105845] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:45.012 [2024-07-23 17:24:40.236523] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:30:45.012 [2024-07-23 17:24:40.237752] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 
offset_begin: 0 offset_end: 6144 00:30:45.271 [2024-07-23 17:24:40.458234] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:30:45.271 [2024-07-23 17:24:40.458757] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:30:45.529 [2024-07-23 17:24:40.824609] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:30:45.787 [2024-07-23 17:24:41.057815] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:30:45.787 [2024-07-23 17:24:41.058099] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:45.787 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:46.046 "name": "raid_bdev1", 00:30:46.046 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:46.046 "strip_size_kb": 0, 00:30:46.046 "state": "online", 00:30:46.046 
"raid_level": "raid1", 00:30:46.046 "superblock": false, 00:30:46.046 "num_base_bdevs": 4, 00:30:46.046 "num_base_bdevs_discovered": 4, 00:30:46.046 "num_base_bdevs_operational": 4, 00:30:46.046 "process": { 00:30:46.046 "type": "rebuild", 00:30:46.046 "target": "spare", 00:30:46.046 "progress": { 00:30:46.046 "blocks": 12288, 00:30:46.046 "percent": 18 00:30:46.046 } 00:30:46.046 }, 00:30:46.046 "base_bdevs_list": [ 00:30:46.046 { 00:30:46.046 "name": "spare", 00:30:46.046 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:46.046 "is_configured": true, 00:30:46.046 "data_offset": 0, 00:30:46.046 "data_size": 65536 00:30:46.046 }, 00:30:46.046 { 00:30:46.046 "name": "BaseBdev2", 00:30:46.046 "uuid": "6abcf502-5ea1-5c9f-882d-9c345a0fe500", 00:30:46.046 "is_configured": true, 00:30:46.046 "data_offset": 0, 00:30:46.046 "data_size": 65536 00:30:46.046 }, 00:30:46.046 { 00:30:46.046 "name": "BaseBdev3", 00:30:46.046 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:46.046 "is_configured": true, 00:30:46.046 "data_offset": 0, 00:30:46.046 "data_size": 65536 00:30:46.046 }, 00:30:46.046 { 00:30:46.046 "name": "BaseBdev4", 00:30:46.046 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:46.046 "is_configured": true, 00:30:46.046 "data_offset": 0, 00:30:46.046 "data_size": 65536 00:30:46.046 } 00:30:46.046 ] 00:30:46.046 }' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local 
num_base_bdevs_operational=4 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:30:46.046 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:30:46.350 [2024-07-23 17:24:41.534550] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:30:46.350 [2024-07-23 17:24:41.678435] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:46.609 [2024-07-23 17:24:41.767066] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1fd90d0 00:30:46.609 [2024-07-23 17:24:41.767094] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2187b90 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:46.609 17:24:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:46.609 17:24:41 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:46.609 [2024-07-23 17:24:41.889346] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:30:46.609 [2024-07-23 17:24:42.007806] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:30:46.609 [2024-07-23 17:24:42.008042] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:46.868 "name": "raid_bdev1", 00:30:46.868 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:46.868 "strip_size_kb": 0, 00:30:46.868 "state": "online", 00:30:46.868 "raid_level": "raid1", 00:30:46.868 "superblock": false, 00:30:46.868 "num_base_bdevs": 4, 00:30:46.868 "num_base_bdevs_discovered": 3, 00:30:46.868 "num_base_bdevs_operational": 3, 00:30:46.868 "process": { 00:30:46.868 "type": "rebuild", 00:30:46.868 "target": "spare", 00:30:46.868 "progress": { 00:30:46.868 "blocks": 22528, 00:30:46.868 "percent": 34 00:30:46.868 } 00:30:46.868 }, 00:30:46.868 "base_bdevs_list": [ 00:30:46.868 { 00:30:46.868 "name": "spare", 00:30:46.868 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:46.868 "is_configured": true, 00:30:46.868 "data_offset": 0, 00:30:46.868 "data_size": 65536 00:30:46.868 }, 00:30:46.868 { 00:30:46.868 "name": null, 00:30:46.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:46.868 "is_configured": false, 00:30:46.868 "data_offset": 0, 00:30:46.868 "data_size": 65536 00:30:46.868 }, 00:30:46.868 { 00:30:46.868 "name": "BaseBdev3", 00:30:46.868 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:46.868 "is_configured": true, 00:30:46.868 "data_offset": 0, 00:30:46.868 "data_size": 65536 00:30:46.868 }, 00:30:46.868 { 00:30:46.868 "name": 
"BaseBdev4", 00:30:46.868 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:46.868 "is_configured": true, 00:30:46.868 "data_offset": 0, 00:30:46.868 "data_size": 65536 00:30:46.868 } 00:30:46.868 ] 00:30:46.868 }' 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=982 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:46.868 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:47.127 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:47.127 "name": "raid_bdev1", 00:30:47.127 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:47.127 "strip_size_kb": 
0, 00:30:47.127 "state": "online", 00:30:47.127 "raid_level": "raid1", 00:30:47.127 "superblock": false, 00:30:47.127 "num_base_bdevs": 4, 00:30:47.127 "num_base_bdevs_discovered": 3, 00:30:47.127 "num_base_bdevs_operational": 3, 00:30:47.127 "process": { 00:30:47.127 "type": "rebuild", 00:30:47.127 "target": "spare", 00:30:47.127 "progress": { 00:30:47.127 "blocks": 26624, 00:30:47.127 "percent": 40 00:30:47.127 } 00:30:47.127 }, 00:30:47.127 "base_bdevs_list": [ 00:30:47.127 { 00:30:47.127 "name": "spare", 00:30:47.127 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:47.127 "is_configured": true, 00:30:47.127 "data_offset": 0, 00:30:47.127 "data_size": 65536 00:30:47.127 }, 00:30:47.127 { 00:30:47.127 "name": null, 00:30:47.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:47.127 "is_configured": false, 00:30:47.127 "data_offset": 0, 00:30:47.127 "data_size": 65536 00:30:47.127 }, 00:30:47.127 { 00:30:47.127 "name": "BaseBdev3", 00:30:47.127 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:47.127 "is_configured": true, 00:30:47.127 "data_offset": 0, 00:30:47.127 "data_size": 65536 00:30:47.127 }, 00:30:47.127 { 00:30:47.127 "name": "BaseBdev4", 00:30:47.127 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:47.127 "is_configured": true, 00:30:47.127 "data_offset": 0, 00:30:47.127 "data_size": 65536 00:30:47.127 } 00:30:47.127 ] 00:30:47.127 }' 00:30:47.127 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:47.127 [2024-07-23 17:24:42.473927] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:30:47.127 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:47.127 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:47.127 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == 
\s\p\a\r\e ]] 00:30:47.127 17:24:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:47.695 [2024-07-23 17:24:42.907044] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:30:47.953 [2024-07-23 17:24:43.221635] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:30:47.953 [2024-07-23 17:24:43.222041] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.211 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:48.211 [2024-07-23 17:24:43.556376] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:30:48.469 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:48.469 "name": "raid_bdev1", 00:30:48.469 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:48.469 "strip_size_kb": 0, 00:30:48.469 
"state": "online", 00:30:48.469 "raid_level": "raid1", 00:30:48.469 "superblock": false, 00:30:48.469 "num_base_bdevs": 4, 00:30:48.469 "num_base_bdevs_discovered": 3, 00:30:48.469 "num_base_bdevs_operational": 3, 00:30:48.469 "process": { 00:30:48.469 "type": "rebuild", 00:30:48.469 "target": "spare", 00:30:48.469 "progress": { 00:30:48.469 "blocks": 47104, 00:30:48.469 "percent": 71 00:30:48.469 } 00:30:48.469 }, 00:30:48.469 "base_bdevs_list": [ 00:30:48.469 { 00:30:48.469 "name": "spare", 00:30:48.469 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:48.469 "is_configured": true, 00:30:48.469 "data_offset": 0, 00:30:48.469 "data_size": 65536 00:30:48.469 }, 00:30:48.469 { 00:30:48.469 "name": null, 00:30:48.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:48.469 "is_configured": false, 00:30:48.469 "data_offset": 0, 00:30:48.469 "data_size": 65536 00:30:48.469 }, 00:30:48.469 { 00:30:48.469 "name": "BaseBdev3", 00:30:48.469 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:48.469 "is_configured": true, 00:30:48.469 "data_offset": 0, 00:30:48.469 "data_size": 65536 00:30:48.469 }, 00:30:48.469 { 00:30:48.469 "name": "BaseBdev4", 00:30:48.469 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:48.469 "is_configured": true, 00:30:48.469 "data_offset": 0, 00:30:48.469 "data_size": 65536 00:30:48.469 } 00:30:48.469 ] 00:30:48.469 }' 00:30:48.469 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:48.469 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:48.469 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:48.728 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:48.728 17:24:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:48.728 [2024-07-23 17:24:43.917279] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:30:48.728 [2024-07-23 17:24:44.043824] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:30:48.728 [2024-07-23 17:24:44.044276] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:30:49.666 [2024-07-23 17:24:44.751843] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:30:49.666 [2024-07-23 17:24:44.820007] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:49.666 [2024-07-23 17:24:44.822082] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:49.666 17:24:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:49.925 "name": "raid_bdev1", 00:30:49.925 "uuid": 
"c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:49.925 "strip_size_kb": 0, 00:30:49.925 "state": "online", 00:30:49.925 "raid_level": "raid1", 00:30:49.925 "superblock": false, 00:30:49.925 "num_base_bdevs": 4, 00:30:49.925 "num_base_bdevs_discovered": 3, 00:30:49.925 "num_base_bdevs_operational": 3, 00:30:49.925 "base_bdevs_list": [ 00:30:49.925 { 00:30:49.925 "name": "spare", 00:30:49.925 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:49.925 "is_configured": true, 00:30:49.925 "data_offset": 0, 00:30:49.925 "data_size": 65536 00:30:49.925 }, 00:30:49.925 { 00:30:49.925 "name": null, 00:30:49.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:49.925 "is_configured": false, 00:30:49.925 "data_offset": 0, 00:30:49.925 "data_size": 65536 00:30:49.925 }, 00:30:49.925 { 00:30:49.925 "name": "BaseBdev3", 00:30:49.925 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:49.925 "is_configured": true, 00:30:49.925 "data_offset": 0, 00:30:49.925 "data_size": 65536 00:30:49.925 }, 00:30:49.925 { 00:30:49.925 "name": "BaseBdev4", 00:30:49.925 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:49.925 "is_configured": true, 00:30:49.925 "data_offset": 0, 00:30:49.925 "data_size": 65536 00:30:49.925 } 00:30:49.925 ] 00:30:49.925 }' 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:49.925 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:50.184 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:50.184 "name": "raid_bdev1", 00:30:50.184 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:50.184 "strip_size_kb": 0, 00:30:50.184 "state": "online", 00:30:50.184 "raid_level": "raid1", 00:30:50.184 "superblock": false, 00:30:50.184 "num_base_bdevs": 4, 00:30:50.184 "num_base_bdevs_discovered": 3, 00:30:50.184 "num_base_bdevs_operational": 3, 00:30:50.184 "base_bdevs_list": [ 00:30:50.184 { 00:30:50.184 "name": "spare", 00:30:50.184 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:50.184 "is_configured": true, 00:30:50.184 "data_offset": 0, 00:30:50.184 "data_size": 65536 00:30:50.184 }, 00:30:50.184 { 00:30:50.184 "name": null, 00:30:50.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:50.184 "is_configured": false, 00:30:50.184 "data_offset": 0, 00:30:50.184 "data_size": 65536 00:30:50.184 }, 00:30:50.184 { 00:30:50.184 "name": "BaseBdev3", 00:30:50.184 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:50.184 "is_configured": true, 00:30:50.184 "data_offset": 0, 00:30:50.184 "data_size": 65536 00:30:50.184 }, 00:30:50.184 { 00:30:50.184 "name": "BaseBdev4", 00:30:50.184 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:50.184 "is_configured": true, 
00:30:50.184 "data_offset": 0, 00:30:50.184 "data_size": 65536 00:30:50.184 } 00:30:50.184 ] 00:30:50.184 }' 00:30:50.184 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:50.184 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:50.184 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:50.443 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:50.444 
17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:50.444 "name": "raid_bdev1", 00:30:50.444 "uuid": "c6fea367-4928-43e2-a9fc-2bb8fdaf15ec", 00:30:50.444 "strip_size_kb": 0, 00:30:50.444 "state": "online", 00:30:50.444 "raid_level": "raid1", 00:30:50.444 "superblock": false, 00:30:50.444 "num_base_bdevs": 4, 00:30:50.444 "num_base_bdevs_discovered": 3, 00:30:50.444 "num_base_bdevs_operational": 3, 00:30:50.444 "base_bdevs_list": [ 00:30:50.444 { 00:30:50.444 "name": "spare", 00:30:50.444 "uuid": "db2556d9-6b3d-504b-967b-77b6f68f27de", 00:30:50.444 "is_configured": true, 00:30:50.444 "data_offset": 0, 00:30:50.444 "data_size": 65536 00:30:50.444 }, 00:30:50.444 { 00:30:50.444 "name": null, 00:30:50.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:50.444 "is_configured": false, 00:30:50.444 "data_offset": 0, 00:30:50.444 "data_size": 65536 00:30:50.444 }, 00:30:50.444 { 00:30:50.444 "name": "BaseBdev3", 00:30:50.444 "uuid": "ae6f2daf-91b3-51bf-9605-8dad3dce011f", 00:30:50.444 "is_configured": true, 00:30:50.444 "data_offset": 0, 00:30:50.444 "data_size": 65536 00:30:50.444 }, 00:30:50.444 { 00:30:50.444 "name": "BaseBdev4", 00:30:50.444 "uuid": "8a59a17c-142a-5d16-9ea2-1312dff11384", 00:30:50.444 "is_configured": true, 00:30:50.444 "data_offset": 0, 00:30:50.444 "data_size": 65536 00:30:50.444 } 00:30:50.444 ] 00:30:50.444 }' 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:50.444 17:24:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:30:51.381 17:24:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:51.640 [2024-07-23 17:24:46.941946] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:51.640 [2024-07-23 17:24:46.941980] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: 
raid bdev state changing from online to offline 00:30:51.640 00:30:51.640 Latency(us) 00:30:51.640 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:51.640 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:30:51.640 raid_bdev1 : 11.87 96.56 289.67 0.00 0.00 14442.82 297.41 122181.90 00:30:51.640 =================================================================================================================== 00:30:51.640 Total : 96.56 289.67 0.00 0.00 14442.82 297.41 122181.90 00:30:51.640 [2024-07-23 17:24:47.038099] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:51.640 [2024-07-23 17:24:47.038128] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:51.640 [2024-07-23 17:24:47.038221] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:51.640 [2024-07-23 17:24:47.038233] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fd9530 name raid_bdev1, state offline 00:30:51.640 0 00:30:51.899 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.899 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:52.468 17:24:47 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:30:52.468 /dev/nbd0 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:52.468 1+0 records in 00:30:52.468 1+0 records out 00:30:52.468 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291467 s, 14.1 MB/s 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev3') 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:30:52.468 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:52.727 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:52.727 17:24:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:30:52.727 /dev/nbd1 00:30:52.727 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:52.986 1+0 records in 00:30:52.986 1+0 records out 00:30:52.986 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289316 s, 14.2 MB/s 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:52.986 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:53.245 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:30:53.504 /dev/nbd1 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:53.504 1+0 records in 00:30:53.504 1+0 records out 00:30:53.504 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293276 s, 14.0 MB/s 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:53.504 17:24:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:53.764 17:24:49 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:53.764 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 51283 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 51283 ']' 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 51283 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:30:54.023 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:54.282 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 51283 00:30:54.282 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:54.282 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:54.282 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 51283' 00:30:54.282 killing process with pid 51283 00:30:54.282 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 51283 00:30:54.282 Received shutdown signal, test time was about 14.313904 seconds 00:30:54.282 00:30:54.282 Latency(us) 00:30:54.282 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:54.282 =================================================================================================================== 00:30:54.282 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:54.282 [2024-07-23 17:24:49.485368] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:54.282 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 51283 00:30:54.282 [2024-07-23 17:24:49.529067] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:30:54.542 
00:30:54.542 real 0m19.989s 00:30:54.542 user 0m31.461s 00:30:54.542 sys 0m3.572s 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:30:54.542 ************************************ 00:30:54.542 END TEST raid_rebuild_test_io 00:30:54.542 ************************************ 00:30:54.542 17:24:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:54.542 17:24:49 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:30:54.542 17:24:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:54.542 17:24:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:54.542 17:24:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:54.542 ************************************ 00:30:54.542 START TEST raid_rebuild_test_sb_io 00:30:54.542 ************************************ 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:54.542 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:30:54.543 17:24:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=54082 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 54082 /var/tmp/spdk-raid.sock 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 54082 ']' 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:54.543 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:54.543 17:24:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:30:54.543 [2024-07-23 17:24:49.910994] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:30:54.543 [2024-07-23 17:24:49.911070] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid54082 ] 00:30:54.543 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:54.543 Zero copy mechanism will not be used. 00:30:54.802 [2024-07-23 17:24:50.046254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:54.802 [2024-07-23 17:24:50.106215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:54.802 [2024-07-23 17:24:50.174274] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:54.802 [2024-07-23 17:24:50.174315] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:55.370 17:24:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:55.370 17:24:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:30:55.370 17:24:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:55.370 17:24:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:30:55.629 BaseBdev1_malloc 00:30:55.629 17:24:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:55.887 [2024-07-23 17:24:51.261679] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:55.887 [2024-07-23 17:24:51.261735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:55.887 [2024-07-23 17:24:51.261758] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1170170 00:30:55.887 [2024-07-23 17:24:51.261770] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:55.887 [2024-07-23 17:24:51.263327] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:55.887 [2024-07-23 17:24:51.263358] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:55.887 BaseBdev1 00:30:55.888 17:24:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:55.888 17:24:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:30:56.146 BaseBdev2_malloc 00:30:56.146 17:24:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:30:56.405 [2024-07-23 17:24:51.759746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:30:56.405 [2024-07-23 17:24:51.759793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:56.405 [2024-07-23 17:24:51.759813] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1056680 00:30:56.405 [2024-07-23 17:24:51.759826] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:56.405 [2024-07-23 17:24:51.761271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:56.405 [2024-07-23 17:24:51.761300] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:56.405 BaseBdev2 00:30:56.405 17:24:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:56.405 17:24:51 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:30:56.664 BaseBdev3_malloc 00:30:56.664 17:24:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:30:56.922 [2024-07-23 17:24:52.265852] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:30:56.922 [2024-07-23 17:24:52.265901] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:56.922 [2024-07-23 17:24:52.265924] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10573d0 00:30:56.922 [2024-07-23 17:24:52.265936] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:56.922 [2024-07-23 17:24:52.267282] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:56.922 [2024-07-23 17:24:52.267308] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:30:56.922 BaseBdev3 00:30:56.922 17:24:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:56.922 17:24:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:30:57.180 BaseBdev4_malloc 00:30:57.180 17:24:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:30:57.439 [2024-07-23 17:24:52.767696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:30:57.439 [2024-07-23 17:24:52.767742] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:30:57.439 [2024-07-23 17:24:52.767763] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x105b2c0 00:30:57.439 [2024-07-23 17:24:52.767775] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:57.439 [2024-07-23 17:24:52.769191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:57.439 [2024-07-23 17:24:52.769219] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:30:57.439 BaseBdev4 00:30:57.439 17:24:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:30:57.698 spare_malloc 00:30:57.698 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:30:57.957 spare_delay 00:30:57.957 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:58.215 [2024-07-23 17:24:53.449968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:58.215 [2024-07-23 17:24:53.450012] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:58.216 [2024-07-23 17:24:53.450031] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1059e80 00:30:58.216 [2024-07-23 17:24:53.450049] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:58.216 [2024-07-23 17:24:53.451451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:58.216 [2024-07-23 17:24:53.451478] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: spare 00:30:58.216 spare 00:30:58.216 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:30:58.474 [2024-07-23 17:24:53.702669] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:58.474 [2024-07-23 17:24:53.703813] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:58.474 [2024-07-23 17:24:53.703866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:30:58.474 [2024-07-23 17:24:53.703918] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:30:58.474 [2024-07-23 17:24:53.704100] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfbf530 00:30:58.474 [2024-07-23 17:24:53.704112] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:30:58.474 [2024-07-23 17:24:53.704297] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbf500 00:30:58.474 [2024-07-23 17:24:53.704439] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfbf530 00:30:58.474 [2024-07-23 17:24:53.704450] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfbf530 00:30:58.474 [2024-07-23 17:24:53.704536] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.474 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:58.732 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:58.732 "name": "raid_bdev1", 00:30:58.732 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:30:58.732 "strip_size_kb": 0, 00:30:58.732 "state": "online", 00:30:58.732 "raid_level": "raid1", 00:30:58.732 "superblock": true, 00:30:58.732 "num_base_bdevs": 4, 00:30:58.732 "num_base_bdevs_discovered": 4, 00:30:58.732 "num_base_bdevs_operational": 4, 00:30:58.732 "base_bdevs_list": [ 00:30:58.732 { 00:30:58.732 "name": "BaseBdev1", 00:30:58.732 "uuid": "88dd9454-d504-5b89-a9b3-16cde1f2ec41", 00:30:58.732 "is_configured": true, 00:30:58.732 "data_offset": 2048, 00:30:58.732 "data_size": 63488 00:30:58.732 }, 00:30:58.732 { 00:30:58.732 "name": "BaseBdev2", 00:30:58.732 "uuid": "e43fde03-b6d9-5754-af81-046de090b7ef", 00:30:58.732 "is_configured": true, 00:30:58.732 "data_offset": 2048, 00:30:58.732 "data_size": 63488 00:30:58.732 }, 00:30:58.732 { 
00:30:58.732 "name": "BaseBdev3", 00:30:58.732 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:30:58.732 "is_configured": true, 00:30:58.732 "data_offset": 2048, 00:30:58.732 "data_size": 63488 00:30:58.732 }, 00:30:58.732 { 00:30:58.732 "name": "BaseBdev4", 00:30:58.732 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:30:58.732 "is_configured": true, 00:30:58.732 "data_offset": 2048, 00:30:58.732 "data_size": 63488 00:30:58.732 } 00:30:58.732 ] 00:30:58.732 }' 00:30:58.732 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:58.732 17:24:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:30:59.299 17:24:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:59.299 17:24:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:30:59.558 [2024-07-23 17:24:54.741704] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:59.558 17:24:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:30:59.558 17:24:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:59.558 17:24:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:30:59.817 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:30:59.817 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:30:59.817 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:59.817 17:24:55 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:30:59.817 [2024-07-23 17:24:55.128502] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbf440 00:30:59.817 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:59.817 Zero copy mechanism will not be used. 00:30:59.817 Running I/O for 60 seconds... 00:31:00.076 [2024-07-23 17:24:55.246629] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:00.076 [2024-07-23 17:24:55.254840] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xfbf440 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:00.076 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:00.398 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:00.398 "name": "raid_bdev1", 00:31:00.398 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:00.398 "strip_size_kb": 0, 00:31:00.398 "state": "online", 00:31:00.398 "raid_level": "raid1", 00:31:00.398 "superblock": true, 00:31:00.398 "num_base_bdevs": 4, 00:31:00.398 "num_base_bdevs_discovered": 3, 00:31:00.398 "num_base_bdevs_operational": 3, 00:31:00.398 "base_bdevs_list": [ 00:31:00.398 { 00:31:00.398 "name": null, 00:31:00.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:00.398 "is_configured": false, 00:31:00.398 "data_offset": 2048, 00:31:00.398 "data_size": 63488 00:31:00.398 }, 00:31:00.398 { 00:31:00.398 "name": "BaseBdev2", 00:31:00.399 "uuid": "e43fde03-b6d9-5754-af81-046de090b7ef", 00:31:00.399 "is_configured": true, 00:31:00.399 "data_offset": 2048, 00:31:00.399 "data_size": 63488 00:31:00.399 }, 00:31:00.399 { 00:31:00.399 "name": "BaseBdev3", 00:31:00.399 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:00.399 "is_configured": true, 00:31:00.399 "data_offset": 2048, 00:31:00.399 "data_size": 63488 00:31:00.399 }, 00:31:00.399 { 00:31:00.399 "name": "BaseBdev4", 00:31:00.399 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:00.399 "is_configured": true, 00:31:00.399 "data_offset": 2048, 00:31:00.399 "data_size": 63488 00:31:00.399 } 00:31:00.399 ] 00:31:00.399 }' 00:31:00.399 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:00.399 17:24:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:00.970 17:24:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 spare 00:31:01.229 [2024-07-23 17:24:56.414074] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:01.229 17:24:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:31:01.229 [2024-07-23 17:24:56.479581] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbfd00 00:31:01.229 [2024-07-23 17:24:56.481967] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:01.229 [2024-07-23 17:24:56.601552] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:31:01.229 [2024-07-23 17:24:56.602883] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:31:01.488 [2024-07-23 17:24:56.845695] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:31:01.488 [2024-07-23 17:24:56.845960] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:31:01.748 [2024-07-23 17:24:57.119494] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:31:02.007 [2024-07-23 17:24:57.362578] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:31:02.007 [2024-07-23 17:24:57.362855] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:02.266 "name": "raid_bdev1", 00:31:02.266 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:02.266 "strip_size_kb": 0, 00:31:02.266 "state": "online", 00:31:02.266 "raid_level": "raid1", 00:31:02.266 "superblock": true, 00:31:02.266 "num_base_bdevs": 4, 00:31:02.266 "num_base_bdevs_discovered": 4, 00:31:02.266 "num_base_bdevs_operational": 4, 00:31:02.266 "process": { 00:31:02.266 "type": "rebuild", 00:31:02.266 "target": "spare", 00:31:02.266 "progress": { 00:31:02.266 "blocks": 12288, 00:31:02.266 "percent": 19 00:31:02.266 } 00:31:02.266 }, 00:31:02.266 "base_bdevs_list": [ 00:31:02.266 { 00:31:02.266 "name": "spare", 00:31:02.266 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:02.266 "is_configured": true, 00:31:02.266 "data_offset": 2048, 00:31:02.266 "data_size": 63488 00:31:02.266 }, 00:31:02.266 { 00:31:02.266 "name": "BaseBdev2", 00:31:02.266 "uuid": "e43fde03-b6d9-5754-af81-046de090b7ef", 00:31:02.266 "is_configured": true, 00:31:02.266 "data_offset": 2048, 00:31:02.266 "data_size": 63488 00:31:02.266 }, 00:31:02.266 { 00:31:02.266 "name": "BaseBdev3", 00:31:02.266 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:02.266 "is_configured": true, 00:31:02.266 "data_offset": 2048, 00:31:02.266 "data_size": 63488 00:31:02.266 }, 00:31:02.266 { 00:31:02.266 "name": "BaseBdev4", 
00:31:02.266 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:02.266 "is_configured": true, 00:31:02.266 "data_offset": 2048, 00:31:02.266 "data_size": 63488 00:31:02.266 } 00:31:02.266 ] 00:31:02.266 }' 00:31:02.266 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:02.525 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:02.525 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:02.525 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:02.525 17:24:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:31:02.784 [2024-07-23 17:24:57.986188] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:02.784 [2024-07-23 17:24:58.069548] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:31:02.784 [2024-07-23 17:24:58.078186] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:02.784 [2024-07-23 17:24:58.089850] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:02.784 [2024-07-23 17:24:58.089881] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:02.784 [2024-07-23 17:24:58.089898] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:02.784 [2024-07-23 17:24:58.112934] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xfbf440 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:31:02.784 17:24:58 
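Editor's note: the `verify_raid_bdev_process` checks traced above extract the rebuild state with `jq -r '.process.type // "none"'` and `jq -r '.process.target // "none"'`. The `//` is jq's alternative operator: it yields the right-hand value whenever the left-hand expression is `null` or `false`, which is exactly what happens once a rebuild finishes and the RPC output no longer carries a `process` object. A minimal sketch of that behavior (sample JSON abbreviated from the dumps above):

```shell
#!/usr/bin/env bash
# jq's '//' operator substitutes the right-hand value when the left-hand
# path evaluates to null -- here, when no rebuild process is running.
with_process='{"name":"raid_bdev1","process":{"type":"rebuild","target":"spare"}}'
no_process='{"name":"raid_bdev1","state":"online"}'

echo "$with_process" | jq -r '.process.type // "none"'   # prints: rebuild
echo "$no_process"   | jq -r '.process.type // "none"'   # prints: none
```

This is why the test can use the same assertion pattern (`[[ rebuild == ... ]]` / `[[ none == ... ]]`) both during and after the rebuild without special-casing the missing field.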
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.784 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:03.043 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:03.043 "name": "raid_bdev1", 00:31:03.043 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:03.043 "strip_size_kb": 0, 00:31:03.043 "state": "online", 00:31:03.043 "raid_level": "raid1", 00:31:03.043 "superblock": true, 00:31:03.043 "num_base_bdevs": 4, 00:31:03.043 "num_base_bdevs_discovered": 3, 00:31:03.043 "num_base_bdevs_operational": 3, 00:31:03.044 "base_bdevs_list": [ 00:31:03.044 { 00:31:03.044 "name": null, 00:31:03.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:03.044 "is_configured": false, 00:31:03.044 "data_offset": 2048, 00:31:03.044 "data_size": 
63488 00:31:03.044 }, 00:31:03.044 { 00:31:03.044 "name": "BaseBdev2", 00:31:03.044 "uuid": "e43fde03-b6d9-5754-af81-046de090b7ef", 00:31:03.044 "is_configured": true, 00:31:03.044 "data_offset": 2048, 00:31:03.044 "data_size": 63488 00:31:03.044 }, 00:31:03.044 { 00:31:03.044 "name": "BaseBdev3", 00:31:03.044 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:03.044 "is_configured": true, 00:31:03.044 "data_offset": 2048, 00:31:03.044 "data_size": 63488 00:31:03.044 }, 00:31:03.044 { 00:31:03.044 "name": "BaseBdev4", 00:31:03.044 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:03.044 "is_configured": true, 00:31:03.044 "data_offset": 2048, 00:31:03.044 "data_size": 63488 00:31:03.044 } 00:31:03.044 ] 00:31:03.044 }' 00:31:03.044 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:03.044 17:24:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:03.612 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:03.612 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:03.612 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:03.612 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:03.612 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:03.872 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:03.872 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:03.872 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:03.872 "name": "raid_bdev1", 00:31:03.872 "uuid": 
"bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:03.872 "strip_size_kb": 0, 00:31:03.872 "state": "online", 00:31:03.872 "raid_level": "raid1", 00:31:03.872 "superblock": true, 00:31:03.872 "num_base_bdevs": 4, 00:31:03.872 "num_base_bdevs_discovered": 3, 00:31:03.872 "num_base_bdevs_operational": 3, 00:31:03.872 "base_bdevs_list": [ 00:31:03.872 { 00:31:03.872 "name": null, 00:31:03.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:03.872 "is_configured": false, 00:31:03.872 "data_offset": 2048, 00:31:03.872 "data_size": 63488 00:31:03.872 }, 00:31:03.872 { 00:31:03.872 "name": "BaseBdev2", 00:31:03.872 "uuid": "e43fde03-b6d9-5754-af81-046de090b7ef", 00:31:03.872 "is_configured": true, 00:31:03.872 "data_offset": 2048, 00:31:03.872 "data_size": 63488 00:31:03.872 }, 00:31:03.872 { 00:31:03.872 "name": "BaseBdev3", 00:31:03.872 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:03.872 "is_configured": true, 00:31:03.872 "data_offset": 2048, 00:31:03.872 "data_size": 63488 00:31:03.872 }, 00:31:03.872 { 00:31:03.872 "name": "BaseBdev4", 00:31:03.872 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:03.872 "is_configured": true, 00:31:03.872 "data_offset": 2048, 00:31:03.872 "data_size": 63488 00:31:03.872 } 00:31:03.872 ] 00:31:03.872 }' 00:31:03.872 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:03.872 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:03.872 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:04.131 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:04.131 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:31:04.131 [2024-07-23 17:24:59.534475] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:04.391 [2024-07-23 17:24:59.591328] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1073ee0 00:31:04.391 [2024-07-23 17:24:59.592828] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:04.391 17:24:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:31:04.391 [2024-07-23 17:24:59.732758] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:31:04.391 [2024-07-23 17:24:59.733175] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:31:04.650 [2024-07-23 17:24:59.855226] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:31:04.650 [2024-07-23 17:24:59.855532] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:31:04.909 [2024-07-23 17:25:00.242435] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:31:05.168 [2024-07-23 17:25:00.456826] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:31:05.168 [2024-07-23 17:25:00.457019] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.427 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:05.427 [2024-07-23 17:25:00.750350] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:05.692 "name": "raid_bdev1", 00:31:05.692 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:05.692 "strip_size_kb": 0, 00:31:05.692 "state": "online", 00:31:05.692 "raid_level": "raid1", 00:31:05.692 "superblock": true, 00:31:05.692 "num_base_bdevs": 4, 00:31:05.692 "num_base_bdevs_discovered": 4, 00:31:05.692 "num_base_bdevs_operational": 4, 00:31:05.692 "process": { 00:31:05.692 "type": "rebuild", 00:31:05.692 "target": "spare", 00:31:05.692 "progress": { 00:31:05.692 "blocks": 14336, 00:31:05.692 "percent": 22 00:31:05.692 } 00:31:05.692 }, 00:31:05.692 "base_bdevs_list": [ 00:31:05.692 { 00:31:05.692 "name": "spare", 00:31:05.692 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:05.692 "is_configured": true, 00:31:05.692 "data_offset": 2048, 00:31:05.692 "data_size": 63488 00:31:05.692 }, 00:31:05.692 { 00:31:05.692 "name": "BaseBdev2", 00:31:05.692 "uuid": "e43fde03-b6d9-5754-af81-046de090b7ef", 00:31:05.692 "is_configured": true, 00:31:05.692 "data_offset": 2048, 00:31:05.692 "data_size": 63488 00:31:05.692 }, 00:31:05.692 { 00:31:05.692 "name": "BaseBdev3", 00:31:05.692 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:05.692 "is_configured": true, 00:31:05.692 "data_offset": 2048, 00:31:05.692 
"data_size": 63488 00:31:05.692 }, 00:31:05.692 { 00:31:05.692 "name": "BaseBdev4", 00:31:05.692 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:05.692 "is_configured": true, 00:31:05.692 "data_offset": 2048, 00:31:05.692 "data_size": 63488 00:31:05.692 } 00:31:05.692 ] 00:31:05.692 }' 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:31:05.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:31:05.692 17:25:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:31:05.692 [2024-07-23 17:25:00.962574] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:31:05.692 [2024-07-23 17:25:00.962845] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:31:05.952 [2024-07-23 
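Editor's note: the `line 665: [: =: unary operator expected` error recorded just above is a classic single-bracket pitfall: an unquoted variable that expands to nothing leaves `[` with the malformed expression `[ = false ]`, so the test errors out (exit status 2) instead of evaluating to true or false. A minimal reproduction of the failure mode, with the two standard fixes (the variable name `flag` is illustrative, not taken from `bdev_raid.sh`):

```shell
#!/usr/bin/env bash
flag=""

# Unquoted empty expansion: '[' sees only '=' and 'false', a malformed
# unary expression, so test reports an error and returns status 2.
[ $flag = false ] 2>/dev/null
echo "unquoted: $?"    # prints: unquoted: 2

# Fix 1: quote the expansion so an empty string stays a real operand;
# the comparison then simply evaluates to false (status 1).
[ "$flag" = false ]
echo "quoted: $?"      # prints: quoted: 1

# Fix 2: the bash [[ ]] builtin does not word-split its operands,
# so the unquoted form is safe there as well.
[[ $flag = false ]]
echo "bracket2: $?"    # prints: bracket2: 1
```

Because `[` returns 2 rather than aborting the script, the test run above continues past the error, which is why the log proceeds to the `num_base_bdevs_operational=4` branch.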
17:25:01.180926] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:06.211 [2024-07-23 17:25:01.390735] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xfbf440 00:31:06.211 [2024-07-23 17:25:01.390765] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1073ee0 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:06.211 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:06.211 [2024-07-23 17:25:01.531565] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:31:06.469 [2024-07-23 17:25:01.702080] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:06.469 "name": "raid_bdev1", 00:31:06.469 "uuid": 
"bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:06.469 "strip_size_kb": 0, 00:31:06.469 "state": "online", 00:31:06.469 "raid_level": "raid1", 00:31:06.469 "superblock": true, 00:31:06.469 "num_base_bdevs": 4, 00:31:06.469 "num_base_bdevs_discovered": 3, 00:31:06.469 "num_base_bdevs_operational": 3, 00:31:06.469 "process": { 00:31:06.469 "type": "rebuild", 00:31:06.469 "target": "spare", 00:31:06.469 "progress": { 00:31:06.469 "blocks": 20480, 00:31:06.469 "percent": 32 00:31:06.469 } 00:31:06.469 }, 00:31:06.469 "base_bdevs_list": [ 00:31:06.469 { 00:31:06.469 "name": "spare", 00:31:06.469 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:06.469 "is_configured": true, 00:31:06.469 "data_offset": 2048, 00:31:06.469 "data_size": 63488 00:31:06.469 }, 00:31:06.469 { 00:31:06.469 "name": null, 00:31:06.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:06.469 "is_configured": false, 00:31:06.469 "data_offset": 2048, 00:31:06.469 "data_size": 63488 00:31:06.469 }, 00:31:06.469 { 00:31:06.469 "name": "BaseBdev3", 00:31:06.469 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:06.469 "is_configured": true, 00:31:06.469 "data_offset": 2048, 00:31:06.469 "data_size": 63488 00:31:06.469 }, 00:31:06.469 { 00:31:06.469 "name": "BaseBdev4", 00:31:06.469 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:06.469 "is_configured": true, 00:31:06.469 "data_offset": 2048, 00:31:06.469 "data_size": 63488 00:31:06.469 } 00:31:06.469 ] 00:31:06.469 }' 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- 
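Editor's note: the odd-looking operands such as `\r\e\b\u\i\l\d` and `\s\p\a\r\e` in the traced comparisons above are not corruption. When bash's xtrace (`set -x`) prints a `[[ ... == ... ]]` test whose right-hand side was quoted, it backslash-escapes every character to show that the operand is matched literally rather than as a glob pattern. A small demonstration (variable name `state` is illustrative):

```shell
#!/usr/bin/env bash
# Under 'set -x', the quoted RHS of a [[ == ]] comparison is echoed with
# each character backslash-escaped, e.g.:  + [[ rebuild == \r\e\b\u\i\l\d ]]
set -x
state="rebuild"
[[ $state == "rebuild" ]] && echo "literal match"
set +x
```

The trace goes to stderr, so the escaped form appears in CI logs like this one even though the script source contains a plain quoted string.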
bdev/bdev_raid.sh@705 -- # local timeout=1001 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:06.469 17:25:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:06.728 [2024-07-23 17:25:02.025567] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:31:06.728 17:25:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:06.728 "name": "raid_bdev1", 00:31:06.728 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:06.728 "strip_size_kb": 0, 00:31:06.728 "state": "online", 00:31:06.728 "raid_level": "raid1", 00:31:06.728 "superblock": true, 00:31:06.728 "num_base_bdevs": 4, 00:31:06.728 "num_base_bdevs_discovered": 3, 00:31:06.728 "num_base_bdevs_operational": 3, 00:31:06.728 "process": { 00:31:06.728 "type": "rebuild", 00:31:06.728 "target": "spare", 00:31:06.728 "progress": { 00:31:06.728 "blocks": 26624, 00:31:06.728 "percent": 41 00:31:06.728 } 00:31:06.728 }, 00:31:06.728 "base_bdevs_list": [ 00:31:06.728 { 00:31:06.728 "name": "spare", 00:31:06.728 "uuid": 
"afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:06.728 "is_configured": true, 00:31:06.729 "data_offset": 2048, 00:31:06.729 "data_size": 63488 00:31:06.729 }, 00:31:06.729 { 00:31:06.729 "name": null, 00:31:06.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:06.729 "is_configured": false, 00:31:06.729 "data_offset": 2048, 00:31:06.729 "data_size": 63488 00:31:06.729 }, 00:31:06.729 { 00:31:06.729 "name": "BaseBdev3", 00:31:06.729 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:06.729 "is_configured": true, 00:31:06.729 "data_offset": 2048, 00:31:06.729 "data_size": 63488 00:31:06.729 }, 00:31:06.729 { 00:31:06.729 "name": "BaseBdev4", 00:31:06.729 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:06.729 "is_configured": true, 00:31:06.729 "data_offset": 2048, 00:31:06.729 "data_size": 63488 00:31:06.729 } 00:31:06.729 ] 00:31:06.729 }' 00:31:06.729 17:25:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:06.729 17:25:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:06.729 17:25:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:06.729 [2024-07-23 17:25:02.144952] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:31:06.988 17:25:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:06.988 17:25:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:31:07.247 [2024-07-23 17:25:02.651722] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:31:07.815 [2024-07-23 17:25:03.020925] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:07.816 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:08.075 [2024-07-23 17:25:03.354112] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:31:08.075 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:08.075 "name": "raid_bdev1", 00:31:08.075 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:08.075 "strip_size_kb": 0, 00:31:08.075 "state": "online", 00:31:08.075 "raid_level": "raid1", 00:31:08.075 "superblock": true, 00:31:08.075 "num_base_bdevs": 4, 00:31:08.075 "num_base_bdevs_discovered": 3, 00:31:08.075 "num_base_bdevs_operational": 3, 00:31:08.075 "process": { 00:31:08.075 "type": "rebuild", 00:31:08.075 "target": "spare", 00:31:08.075 "progress": { 00:31:08.075 "blocks": 47104, 00:31:08.075 "percent": 74 00:31:08.075 } 00:31:08.075 }, 00:31:08.075 "base_bdevs_list": [ 00:31:08.075 { 00:31:08.075 "name": "spare", 00:31:08.075 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:08.075 "is_configured": true, 00:31:08.075 "data_offset": 2048, 
00:31:08.075 "data_size": 63488 00:31:08.075 }, 00:31:08.075 { 00:31:08.075 "name": null, 00:31:08.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:08.075 "is_configured": false, 00:31:08.075 "data_offset": 2048, 00:31:08.075 "data_size": 63488 00:31:08.075 }, 00:31:08.075 { 00:31:08.075 "name": "BaseBdev3", 00:31:08.075 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:08.075 "is_configured": true, 00:31:08.075 "data_offset": 2048, 00:31:08.075 "data_size": 63488 00:31:08.075 }, 00:31:08.075 { 00:31:08.075 "name": "BaseBdev4", 00:31:08.075 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:08.075 "is_configured": true, 00:31:08.075 "data_offset": 2048, 00:31:08.075 "data_size": 63488 00:31:08.075 } 00:31:08.075 ] 00:31:08.075 }' 00:31:08.075 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:08.075 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:08.075 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:08.334 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:08.334 17:25:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:31:08.334 [2024-07-23 17:25:03.584386] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:31:08.901 [2024-07-23 17:25:04.053379] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:31:08.901 [2024-07-23 17:25:04.284197] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:09.159 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:09.417 [2024-07-23 17:25:04.624901] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:31:09.417 [2024-07-23 17:25:04.723589] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:31:09.417 [2024-07-23 17:25:04.734446] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:09.417 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:09.417 "name": "raid_bdev1", 00:31:09.417 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:09.417 "strip_size_kb": 0, 00:31:09.417 "state": "online", 00:31:09.417 "raid_level": "raid1", 00:31:09.417 "superblock": true, 00:31:09.417 "num_base_bdevs": 4, 00:31:09.417 "num_base_bdevs_discovered": 3, 00:31:09.417 "num_base_bdevs_operational": 3, 00:31:09.417 "base_bdevs_list": [ 00:31:09.417 { 00:31:09.417 "name": "spare", 00:31:09.417 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:09.417 "is_configured": true, 00:31:09.417 "data_offset": 2048, 00:31:09.417 "data_size": 63488 00:31:09.417 }, 00:31:09.417 { 00:31:09.417 "name": null, 
00:31:09.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:09.417 "is_configured": false, 00:31:09.417 "data_offset": 2048, 00:31:09.417 "data_size": 63488 00:31:09.417 }, 00:31:09.417 { 00:31:09.417 "name": "BaseBdev3", 00:31:09.417 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:09.417 "is_configured": true, 00:31:09.417 "data_offset": 2048, 00:31:09.417 "data_size": 63488 00:31:09.417 }, 00:31:09.417 { 00:31:09.417 "name": "BaseBdev4", 00:31:09.417 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:09.417 "is_configured": true, 00:31:09.417 "data_offset": 2048, 00:31:09.417 "data_size": 63488 00:31:09.417 } 00:31:09.417 ] 00:31:09.417 }' 00:31:09.417 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:09.417 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:31:09.417 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:09.676 17:25:04 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:09.935 "name": "raid_bdev1", 00:31:09.935 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:09.935 "strip_size_kb": 0, 00:31:09.935 "state": "online", 00:31:09.935 "raid_level": "raid1", 00:31:09.935 "superblock": true, 00:31:09.935 "num_base_bdevs": 4, 00:31:09.935 "num_base_bdevs_discovered": 3, 00:31:09.935 "num_base_bdevs_operational": 3, 00:31:09.935 "base_bdevs_list": [ 00:31:09.935 { 00:31:09.935 "name": "spare", 00:31:09.935 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:09.935 "is_configured": true, 00:31:09.935 "data_offset": 2048, 00:31:09.935 "data_size": 63488 00:31:09.935 }, 00:31:09.935 { 00:31:09.935 "name": null, 00:31:09.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:09.935 "is_configured": false, 00:31:09.935 "data_offset": 2048, 00:31:09.935 "data_size": 63488 00:31:09.935 }, 00:31:09.935 { 00:31:09.935 "name": "BaseBdev3", 00:31:09.935 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:09.935 "is_configured": true, 00:31:09.935 "data_offset": 2048, 00:31:09.935 "data_size": 63488 00:31:09.935 }, 00:31:09.935 { 00:31:09.935 "name": "BaseBdev4", 00:31:09.935 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:09.935 "is_configured": true, 00:31:09.935 "data_offset": 2048, 00:31:09.935 "data_size": 63488 00:31:09.935 } 00:31:09.935 ] 00:31:09.935 }' 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # 
[[ none == \n\o\n\e ]] 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:09.935 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:10.193 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:10.193 "name": "raid_bdev1", 00:31:10.193 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:10.193 "strip_size_kb": 0, 00:31:10.193 "state": "online", 00:31:10.193 "raid_level": "raid1", 00:31:10.193 "superblock": true, 00:31:10.193 "num_base_bdevs": 4, 00:31:10.193 "num_base_bdevs_discovered": 3, 00:31:10.193 "num_base_bdevs_operational": 3, 00:31:10.193 "base_bdevs_list": [ 00:31:10.193 { 
00:31:10.193 "name": "spare", 00:31:10.193 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:10.193 "is_configured": true, 00:31:10.193 "data_offset": 2048, 00:31:10.193 "data_size": 63488 00:31:10.193 }, 00:31:10.193 { 00:31:10.193 "name": null, 00:31:10.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:10.193 "is_configured": false, 00:31:10.193 "data_offset": 2048, 00:31:10.193 "data_size": 63488 00:31:10.193 }, 00:31:10.193 { 00:31:10.193 "name": "BaseBdev3", 00:31:10.193 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:10.193 "is_configured": true, 00:31:10.193 "data_offset": 2048, 00:31:10.193 "data_size": 63488 00:31:10.193 }, 00:31:10.193 { 00:31:10.193 "name": "BaseBdev4", 00:31:10.193 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:10.193 "is_configured": true, 00:31:10.193 "data_offset": 2048, 00:31:10.193 "data_size": 63488 00:31:10.193 } 00:31:10.193 ] 00:31:10.193 }' 00:31:10.193 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:10.193 17:25:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:10.758 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:11.016 [2024-07-23 17:25:06.291622] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:11.016 [2024-07-23 17:25:06.291659] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:11.016 00:31:11.016 Latency(us) 00:31:11.016 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:11.016 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:31:11.016 raid_bdev1 : 11.23 90.44 271.33 0.00 0.00 14074.65 284.94 120358.29 00:31:11.016 
=================================================================================================================== 00:31:11.016 Total : 90.44 271.33 0.00 0.00 14074.65 284.94 120358.29 00:31:11.016 [2024-07-23 17:25:06.395872] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:11.016 [2024-07-23 17:25:06.395909] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:11.016 [2024-07-23 17:25:06.396000] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:11.016 [2024-07-23 17:25:06.396012] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbf530 name raid_bdev1, state offline 00:31:11.016 0 00:31:11.016 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.016 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 
-- # local nbd_list 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:31:11.274 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:31:11.532 /dev/nbd0 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:11.532 1+0 records in 00:31:11.532 1+0 records out 00:31:11.532 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273948 s, 15.0 MB/s 00:31:11.532 17:25:06 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:31:11.532 17:25:06 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:31:11.532 17:25:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:31:11.790 /dev/nbd1 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:11.790 1+0 records in 00:31:11.790 1+0 records out 00:31:11.790 4096 bytes (4.1 kB, 4.0 KiB) copied, 
0.000312531 s, 13.1 MB/s 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:31:11.790 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:12.049 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:31:12.306 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:12.306 
17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:12.306 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:12.306 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:12.306 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:12.306 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:12.306 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:31:12.307 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:31:12.565 /dev/nbd1 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:12.565 1+0 records in 00:31:12.565 1+0 records out 00:31:12.565 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291362 s, 14.1 MB/s 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:12.565 17:25:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 
00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:31:12.822 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:12.823 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # 
return 0 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:31:13.080 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:13.338 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:13.596 [2024-07-23 17:25:08.776451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:13.596 [2024-07-23 17:25:08.776496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:13.596 [2024-07-23 17:25:08.776515] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x105a0b0 00:31:13.596 [2024-07-23 17:25:08.776527] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:13.596 [2024-07-23 17:25:08.778234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:13.596 [2024-07-23 17:25:08.778261] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:13.596 [2024-07-23 17:25:08.778340] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:13.597 [2024-07-23 17:25:08.778366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:13.597 [2024-07-23 17:25:08.778466] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:31:13.597 [2024-07-23 17:25:08.778537] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:31:13.597 spare 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:13.597 [2024-07-23 17:25:08.878851] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfbd000 00:31:13.597 [2024-07-23 17:25:08.878868] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:31:13.597 [2024-07-23 17:25:08.879055] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10740f0 00:31:13.597 [2024-07-23 17:25:08.879204] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfbd000 00:31:13.597 [2024-07-23 17:25:08.879215] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfbd000 00:31:13.597 [2024-07-23 17:25:08.879313] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:13.597 "name": "raid_bdev1", 00:31:13.597 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:13.597 "strip_size_kb": 0, 00:31:13.597 "state": "online", 00:31:13.597 "raid_level": "raid1", 00:31:13.597 "superblock": true, 00:31:13.597 "num_base_bdevs": 4, 00:31:13.597 "num_base_bdevs_discovered": 3, 00:31:13.597 "num_base_bdevs_operational": 3, 00:31:13.597 "base_bdevs_list": [ 00:31:13.597 { 00:31:13.597 "name": "spare", 00:31:13.597 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:13.597 "is_configured": true, 00:31:13.597 "data_offset": 2048, 00:31:13.597 "data_size": 63488 00:31:13.597 }, 00:31:13.597 { 00:31:13.597 "name": null, 00:31:13.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:13.597 "is_configured": false, 00:31:13.597 "data_offset": 2048, 00:31:13.597 "data_size": 63488 00:31:13.597 }, 00:31:13.597 { 00:31:13.597 "name": "BaseBdev3", 00:31:13.597 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:13.597 "is_configured": true, 00:31:13.597 "data_offset": 2048, 00:31:13.597 "data_size": 63488 00:31:13.597 }, 00:31:13.597 { 00:31:13.597 "name": "BaseBdev4", 00:31:13.597 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:13.597 "is_configured": true, 00:31:13.597 "data_offset": 2048, 00:31:13.597 "data_size": 63488 00:31:13.597 } 00:31:13.597 ] 00:31:13.597 }' 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:13.597 17:25:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:14.164 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:14.164 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:14.164 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:14.164 
17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:14.164 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:14.164 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:14.164 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:14.452 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:14.452 "name": "raid_bdev1", 00:31:14.452 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:14.452 "strip_size_kb": 0, 00:31:14.452 "state": "online", 00:31:14.452 "raid_level": "raid1", 00:31:14.452 "superblock": true, 00:31:14.452 "num_base_bdevs": 4, 00:31:14.452 "num_base_bdevs_discovered": 3, 00:31:14.452 "num_base_bdevs_operational": 3, 00:31:14.452 "base_bdevs_list": [ 00:31:14.452 { 00:31:14.452 "name": "spare", 00:31:14.452 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:14.452 "is_configured": true, 00:31:14.452 "data_offset": 2048, 00:31:14.452 "data_size": 63488 00:31:14.452 }, 00:31:14.452 { 00:31:14.452 "name": null, 00:31:14.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:14.452 "is_configured": false, 00:31:14.452 "data_offset": 2048, 00:31:14.452 "data_size": 63488 00:31:14.452 }, 00:31:14.452 { 00:31:14.452 "name": "BaseBdev3", 00:31:14.452 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:14.452 "is_configured": true, 00:31:14.452 "data_offset": 2048, 00:31:14.452 "data_size": 63488 00:31:14.452 }, 00:31:14.452 { 00:31:14.452 "name": "BaseBdev4", 00:31:14.452 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:14.452 "is_configured": true, 00:31:14.452 "data_offset": 2048, 00:31:14.452 "data_size": 63488 00:31:14.452 } 00:31:14.452 ] 00:31:14.452 }' 00:31:14.452 17:25:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:14.710 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:14.710 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:14.710 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:14.710 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:14.710 17:25:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:31:14.969 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:31:14.969 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:31:15.228 [2024-07-23 17:25:10.393089] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.228 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:15.486 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:15.486 "name": "raid_bdev1", 00:31:15.486 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:15.486 "strip_size_kb": 0, 00:31:15.486 "state": "online", 00:31:15.486 "raid_level": "raid1", 00:31:15.486 "superblock": true, 00:31:15.486 "num_base_bdevs": 4, 00:31:15.486 "num_base_bdevs_discovered": 2, 00:31:15.486 "num_base_bdevs_operational": 2, 00:31:15.486 "base_bdevs_list": [ 00:31:15.486 { 00:31:15.486 "name": null, 00:31:15.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:15.486 "is_configured": false, 00:31:15.486 "data_offset": 2048, 00:31:15.486 "data_size": 63488 00:31:15.486 }, 00:31:15.486 { 00:31:15.486 "name": null, 00:31:15.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:15.486 "is_configured": false, 00:31:15.486 "data_offset": 2048, 00:31:15.486 "data_size": 63488 00:31:15.486 }, 00:31:15.486 { 00:31:15.486 "name": "BaseBdev3", 00:31:15.486 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:15.486 "is_configured": true, 00:31:15.486 "data_offset": 2048, 00:31:15.486 "data_size": 63488 00:31:15.486 }, 00:31:15.486 { 00:31:15.486 "name": "BaseBdev4", 00:31:15.486 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:15.486 "is_configured": true, 00:31:15.487 "data_offset": 2048, 
00:31:15.487 "data_size": 63488 00:31:15.487 } 00:31:15.487 ] 00:31:15.487 }' 00:31:15.487 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:15.487 17:25:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:16.053 17:25:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:31:16.312 [2024-07-23 17:25:11.532310] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:16.312 [2024-07-23 17:25:11.532471] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:31:16.312 [2024-07-23 17:25:11.532487] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:31:16.312 [2024-07-23 17:25:11.532514] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:16.312 [2024-07-23 17:25:11.536930] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1073e60 00:31:16.312 [2024-07-23 17:25:11.539263] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:16.312 17:25:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.247 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:17.505 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:17.505 "name": "raid_bdev1", 00:31:17.505 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:17.505 "strip_size_kb": 0, 00:31:17.505 "state": "online", 00:31:17.505 "raid_level": "raid1", 00:31:17.505 "superblock": true, 00:31:17.506 "num_base_bdevs": 4, 00:31:17.506 "num_base_bdevs_discovered": 3, 00:31:17.506 "num_base_bdevs_operational": 3, 00:31:17.506 "process": { 00:31:17.506 "type": "rebuild", 00:31:17.506 "target": "spare", 00:31:17.506 "progress": { 00:31:17.506 "blocks": 24576, 00:31:17.506 "percent": 38 00:31:17.506 } 00:31:17.506 }, 00:31:17.506 "base_bdevs_list": [ 00:31:17.506 { 00:31:17.506 "name": "spare", 00:31:17.506 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:17.506 "is_configured": true, 00:31:17.506 "data_offset": 2048, 00:31:17.506 "data_size": 63488 00:31:17.506 }, 00:31:17.506 { 00:31:17.506 "name": null, 00:31:17.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:17.506 "is_configured": false, 00:31:17.506 "data_offset": 2048, 00:31:17.506 "data_size": 63488 00:31:17.506 }, 00:31:17.506 { 00:31:17.506 "name": "BaseBdev3", 00:31:17.506 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:17.506 "is_configured": true, 00:31:17.506 "data_offset": 2048, 00:31:17.506 "data_size": 63488 00:31:17.506 }, 00:31:17.506 { 00:31:17.506 "name": "BaseBdev4", 00:31:17.506 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:17.506 "is_configured": true, 00:31:17.506 "data_offset": 2048, 00:31:17.506 "data_size": 63488 00:31:17.506 } 00:31:17.506 ] 00:31:17.506 }' 00:31:17.506 17:25:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:17.506 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:17.506 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:17.506 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:17.506 17:25:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:17.765 [2024-07-23 17:25:13.151863] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:17.765 [2024-07-23 17:25:13.152219] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:17.765 [2024-07-23 17:25:13.152262] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:17.765 [2024-07-23 17:25:13.152278] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:17.765 [2024-07-23 17:25:13.152285] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:17.765 
17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.765 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:18.023 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:18.023 "name": "raid_bdev1", 00:31:18.023 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:18.023 "strip_size_kb": 0, 00:31:18.023 "state": "online", 00:31:18.023 "raid_level": "raid1", 00:31:18.023 "superblock": true, 00:31:18.023 "num_base_bdevs": 4, 00:31:18.023 "num_base_bdevs_discovered": 2, 00:31:18.024 "num_base_bdevs_operational": 2, 00:31:18.024 "base_bdevs_list": [ 00:31:18.024 { 00:31:18.024 "name": null, 00:31:18.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:18.024 "is_configured": false, 00:31:18.024 "data_offset": 2048, 00:31:18.024 "data_size": 63488 00:31:18.024 }, 00:31:18.024 { 00:31:18.024 "name": null, 00:31:18.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:18.024 "is_configured": false, 00:31:18.024 "data_offset": 2048, 00:31:18.024 "data_size": 63488 00:31:18.024 }, 00:31:18.024 { 00:31:18.024 "name": "BaseBdev3", 00:31:18.024 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:18.024 "is_configured": true, 00:31:18.024 "data_offset": 2048, 00:31:18.024 "data_size": 63488 00:31:18.024 }, 00:31:18.024 { 00:31:18.024 "name": "BaseBdev4", 00:31:18.024 "uuid": 
"8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:18.024 "is_configured": true, 00:31:18.024 "data_offset": 2048, 00:31:18.024 "data_size": 63488 00:31:18.024 } 00:31:18.024 ] 00:31:18.024 }' 00:31:18.024 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:18.024 17:25:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:18.957 17:25:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:18.957 [2024-07-23 17:25:14.243539] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:18.957 [2024-07-23 17:25:14.243596] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:18.957 [2024-07-23 17:25:14.243623] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbd470 00:31:18.957 [2024-07-23 17:25:14.243635] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:18.957 [2024-07-23 17:25:14.244017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:18.957 [2024-07-23 17:25:14.244034] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:18.957 [2024-07-23 17:25:14.244118] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:18.957 [2024-07-23 17:25:14.244129] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:31:18.957 [2024-07-23 17:25:14.244140] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:31:18.957 [2024-07-23 17:25:14.244159] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:18.957 [2024-07-23 17:25:14.248495] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfaab60 00:31:18.957 spare 00:31:18.957 [2024-07-23 17:25:14.249865] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:18.957 17:25:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:19.892 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:20.150 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:20.150 "name": "raid_bdev1", 00:31:20.150 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:20.150 "strip_size_kb": 0, 00:31:20.150 "state": "online", 00:31:20.150 "raid_level": "raid1", 00:31:20.150 "superblock": true, 00:31:20.150 "num_base_bdevs": 4, 00:31:20.150 "num_base_bdevs_discovered": 3, 00:31:20.150 "num_base_bdevs_operational": 3, 00:31:20.150 "process": { 00:31:20.150 "type": "rebuild", 00:31:20.150 "target": "spare", 00:31:20.150 "progress": { 00:31:20.150 
"blocks": 24576, 00:31:20.150 "percent": 38 00:31:20.150 } 00:31:20.150 }, 00:31:20.150 "base_bdevs_list": [ 00:31:20.150 { 00:31:20.150 "name": "spare", 00:31:20.150 "uuid": "afbd4e89-f68c-57a3-aaf8-7e3644bfacff", 00:31:20.150 "is_configured": true, 00:31:20.150 "data_offset": 2048, 00:31:20.150 "data_size": 63488 00:31:20.150 }, 00:31:20.150 { 00:31:20.150 "name": null, 00:31:20.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:20.150 "is_configured": false, 00:31:20.150 "data_offset": 2048, 00:31:20.150 "data_size": 63488 00:31:20.150 }, 00:31:20.150 { 00:31:20.150 "name": "BaseBdev3", 00:31:20.150 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:20.150 "is_configured": true, 00:31:20.150 "data_offset": 2048, 00:31:20.150 "data_size": 63488 00:31:20.150 }, 00:31:20.150 { 00:31:20.150 "name": "BaseBdev4", 00:31:20.150 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:20.150 "is_configured": true, 00:31:20.150 "data_offset": 2048, 00:31:20.150 "data_size": 63488 00:31:20.150 } 00:31:20.150 ] 00:31:20.150 }' 00:31:20.150 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:20.409 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:20.409 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:20.409 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:20.409 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:20.667 [2024-07-23 17:25:15.845880] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:20.667 [2024-07-23 17:25:15.862354] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:20.667 [2024-07-23 
17:25:15.862399] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:20.667 [2024-07-23 17:25:15.862415] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:20.668 [2024-07-23 17:25:15.862424] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:20.668 17:25:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:20.926 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:20.926 "name": "raid_bdev1", 00:31:20.926 "uuid": 
"bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:20.926 "strip_size_kb": 0, 00:31:20.926 "state": "online", 00:31:20.926 "raid_level": "raid1", 00:31:20.926 "superblock": true, 00:31:20.926 "num_base_bdevs": 4, 00:31:20.926 "num_base_bdevs_discovered": 2, 00:31:20.926 "num_base_bdevs_operational": 2, 00:31:20.926 "base_bdevs_list": [ 00:31:20.926 { 00:31:20.926 "name": null, 00:31:20.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:20.926 "is_configured": false, 00:31:20.926 "data_offset": 2048, 00:31:20.926 "data_size": 63488 00:31:20.926 }, 00:31:20.926 { 00:31:20.926 "name": null, 00:31:20.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:20.926 "is_configured": false, 00:31:20.926 "data_offset": 2048, 00:31:20.926 "data_size": 63488 00:31:20.926 }, 00:31:20.926 { 00:31:20.926 "name": "BaseBdev3", 00:31:20.926 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:20.926 "is_configured": true, 00:31:20.926 "data_offset": 2048, 00:31:20.926 "data_size": 63488 00:31:20.926 }, 00:31:20.926 { 00:31:20.926 "name": "BaseBdev4", 00:31:20.926 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:20.926 "is_configured": true, 00:31:20.926 "data_offset": 2048, 00:31:20.926 "data_size": 63488 00:31:20.926 } 00:31:20.926 ] 00:31:20.926 }' 00:31:20.926 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:20.926 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:21.493 17:25:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:21.751 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:21.751 "name": "raid_bdev1", 00:31:21.751 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:21.751 "strip_size_kb": 0, 00:31:21.751 "state": "online", 00:31:21.751 "raid_level": "raid1", 00:31:21.751 "superblock": true, 00:31:21.751 "num_base_bdevs": 4, 00:31:21.751 "num_base_bdevs_discovered": 2, 00:31:21.751 "num_base_bdevs_operational": 2, 00:31:21.751 "base_bdevs_list": [ 00:31:21.751 { 00:31:21.751 "name": null, 00:31:21.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:21.751 "is_configured": false, 00:31:21.751 "data_offset": 2048, 00:31:21.751 "data_size": 63488 00:31:21.751 }, 00:31:21.751 { 00:31:21.751 "name": null, 00:31:21.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:21.751 "is_configured": false, 00:31:21.751 "data_offset": 2048, 00:31:21.751 "data_size": 63488 00:31:21.751 }, 00:31:21.751 { 00:31:21.751 "name": "BaseBdev3", 00:31:21.751 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:21.751 "is_configured": true, 00:31:21.751 "data_offset": 2048, 00:31:21.751 "data_size": 63488 00:31:21.751 }, 00:31:21.751 { 00:31:21.751 "name": "BaseBdev4", 00:31:21.751 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:21.751 "is_configured": true, 00:31:21.751 "data_offset": 2048, 00:31:21.751 "data_size": 63488 00:31:21.752 } 00:31:21.752 ] 00:31:21.752 }' 00:31:21.752 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:21.752 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 
-- # [[ none == \n\o\n\e ]] 00:31:21.752 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:21.752 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:21.752 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:31:22.010 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:31:22.269 [2024-07-23 17:25:17.600053] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:31:22.269 [2024-07-23 17:25:17.600105] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:22.269 [2024-07-23 17:25:17.600132] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11704b0 00:31:22.269 [2024-07-23 17:25:17.600145] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:22.269 [2024-07-23 17:25:17.600506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:22.269 [2024-07-23 17:25:17.600523] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:22.269 [2024-07-23 17:25:17.600592] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:31:22.269 [2024-07-23 17:25:17.600604] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:31:22.269 [2024-07-23 17:25:17.600614] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:22.269 BaseBdev1 00:31:22.269 17:25:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # 
sleep 1 00:31:23.203 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:23.203 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:23.203 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:23.203 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:23.203 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:23.203 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:23.204 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:23.204 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:23.204 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:23.204 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:23.463 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:23.463 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:23.463 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:23.463 "name": "raid_bdev1", 00:31:23.463 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:23.463 "strip_size_kb": 0, 00:31:23.463 "state": "online", 00:31:23.463 "raid_level": "raid1", 00:31:23.463 "superblock": true, 00:31:23.463 "num_base_bdevs": 4, 00:31:23.463 "num_base_bdevs_discovered": 2, 00:31:23.463 "num_base_bdevs_operational": 2, 00:31:23.463 "base_bdevs_list": [ 00:31:23.463 { 00:31:23.463 "name": 
null, 00:31:23.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:23.463 "is_configured": false, 00:31:23.463 "data_offset": 2048, 00:31:23.463 "data_size": 63488 00:31:23.463 }, 00:31:23.463 { 00:31:23.463 "name": null, 00:31:23.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:23.463 "is_configured": false, 00:31:23.463 "data_offset": 2048, 00:31:23.463 "data_size": 63488 00:31:23.463 }, 00:31:23.463 { 00:31:23.463 "name": "BaseBdev3", 00:31:23.463 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:23.463 "is_configured": true, 00:31:23.463 "data_offset": 2048, 00:31:23.463 "data_size": 63488 00:31:23.463 }, 00:31:23.463 { 00:31:23.463 "name": "BaseBdev4", 00:31:23.463 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:23.463 "is_configured": true, 00:31:23.463 "data_offset": 2048, 00:31:23.463 "data_size": 63488 00:31:23.463 } 00:31:23.463 ] 00:31:23.463 }' 00:31:23.463 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:23.463 17:25:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:24.399 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:24.658 "name": "raid_bdev1", 00:31:24.658 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:24.658 "strip_size_kb": 0, 00:31:24.658 "state": "online", 00:31:24.658 "raid_level": "raid1", 00:31:24.658 "superblock": true, 00:31:24.658 "num_base_bdevs": 4, 00:31:24.658 "num_base_bdevs_discovered": 2, 00:31:24.658 "num_base_bdevs_operational": 2, 00:31:24.658 "base_bdevs_list": [ 00:31:24.658 { 00:31:24.658 "name": null, 00:31:24.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:24.658 "is_configured": false, 00:31:24.658 "data_offset": 2048, 00:31:24.658 "data_size": 63488 00:31:24.658 }, 00:31:24.658 { 00:31:24.658 "name": null, 00:31:24.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:24.658 "is_configured": false, 00:31:24.658 "data_offset": 2048, 00:31:24.658 "data_size": 63488 00:31:24.658 }, 00:31:24.658 { 00:31:24.658 "name": "BaseBdev3", 00:31:24.658 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:24.658 "is_configured": true, 00:31:24.658 "data_offset": 2048, 00:31:24.658 "data_size": 63488 00:31:24.658 }, 00:31:24.658 { 00:31:24.658 "name": "BaseBdev4", 00:31:24.658 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:24.658 "is_configured": true, 00:31:24.658 "data_offset": 2048, 00:31:24.658 "data_size": 63488 00:31:24.658 } 00:31:24.658 ] 00:31:24.658 }' 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:31:24.658 17:25:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:24.917 [2024-07-23 17:25:20.159200] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:31:24.917 [2024-07-23 17:25:20.159336] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:31:24.917 [2024-07-23 17:25:20.159353] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:24.917 request: 00:31:24.917 { 00:31:24.917 "base_bdev": "BaseBdev1", 00:31:24.917 "raid_bdev": "raid_bdev1", 00:31:24.917 "method": "bdev_raid_add_base_bdev", 00:31:24.917 "req_id": 1 00:31:24.917 } 00:31:24.917 Got JSON-RPC error response 00:31:24.917 response: 00:31:24.917 { 00:31:24.917 "code": -22, 00:31:24.917 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:31:24.917 } 00:31:24.917 17:25:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:31:24.917 17:25:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:31:24.917 17:25:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:31:24.917 17:25:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:31:24.917 17:25:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:25.852 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:26.110 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:26.110 "name": "raid_bdev1", 00:31:26.110 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:26.110 "strip_size_kb": 0, 00:31:26.110 "state": "online", 00:31:26.110 "raid_level": "raid1", 00:31:26.110 "superblock": true, 00:31:26.110 "num_base_bdevs": 4, 00:31:26.110 "num_base_bdevs_discovered": 2, 00:31:26.110 "num_base_bdevs_operational": 2, 00:31:26.110 "base_bdevs_list": [ 00:31:26.110 { 00:31:26.110 "name": null, 00:31:26.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:26.110 "is_configured": false, 00:31:26.110 "data_offset": 2048, 00:31:26.110 "data_size": 63488 00:31:26.110 }, 00:31:26.110 { 00:31:26.110 "name": null, 00:31:26.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:26.110 "is_configured": false, 00:31:26.110 "data_offset": 2048, 00:31:26.110 "data_size": 63488 00:31:26.110 }, 00:31:26.110 { 00:31:26.110 "name": "BaseBdev3", 00:31:26.110 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:26.110 "is_configured": true, 00:31:26.110 "data_offset": 2048, 00:31:26.110 "data_size": 63488 00:31:26.110 }, 00:31:26.110 { 00:31:26.110 "name": "BaseBdev4", 00:31:26.110 "uuid": 
"8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:26.110 "is_configured": true, 00:31:26.110 "data_offset": 2048, 00:31:26.110 "data_size": 63488 00:31:26.110 } 00:31:26.110 ] 00:31:26.110 }' 00:31:26.110 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:26.110 17:25:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:26.678 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:26.937 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:26.937 "name": "raid_bdev1", 00:31:26.937 "uuid": "bbd17d2c-32a6-4e2f-ad18-1ba7812c442e", 00:31:26.937 "strip_size_kb": 0, 00:31:26.937 "state": "online", 00:31:26.937 "raid_level": "raid1", 00:31:26.937 "superblock": true, 00:31:26.937 "num_base_bdevs": 4, 00:31:26.937 "num_base_bdevs_discovered": 2, 00:31:26.937 "num_base_bdevs_operational": 2, 00:31:26.937 "base_bdevs_list": [ 00:31:26.937 { 00:31:26.937 "name": null, 00:31:26.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:26.937 "is_configured": false, 00:31:26.937 "data_offset": 2048, 00:31:26.937 "data_size": 63488 
00:31:26.937 }, 00:31:26.937 { 00:31:26.937 "name": null, 00:31:26.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:26.937 "is_configured": false, 00:31:26.937 "data_offset": 2048, 00:31:26.937 "data_size": 63488 00:31:26.937 }, 00:31:26.937 { 00:31:26.937 "name": "BaseBdev3", 00:31:26.937 "uuid": "902197ab-c665-55ac-b9c2-7b032670c20d", 00:31:26.937 "is_configured": true, 00:31:26.937 "data_offset": 2048, 00:31:26.937 "data_size": 63488 00:31:26.937 }, 00:31:26.937 { 00:31:26.937 "name": "BaseBdev4", 00:31:26.937 "uuid": "8aea5edd-76dd-518d-9d9c-84c3e15e6583", 00:31:26.937 "is_configured": true, 00:31:26.937 "data_offset": 2048, 00:31:26.937 "data_size": 63488 00:31:26.937 } 00:31:26.937 ] 00:31:26.937 }' 00:31:26.937 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 54082 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 54082 ']' 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 54082 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 54082 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:27.196 17:25:22 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 54082' 00:31:27.196 killing process with pid 54082 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 54082 00:31:27.196 Received shutdown signal, test time was about 27.280166 seconds 00:31:27.196 00:31:27.196 Latency(us) 00:31:27.196 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:27.196 =================================================================================================================== 00:31:27.196 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:27.196 [2024-07-23 17:25:22.477606] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:27.196 [2024-07-23 17:25:22.477715] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:27.196 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 54082 00:31:27.196 [2024-07-23 17:25:22.477780] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:27.196 [2024-07-23 17:25:22.477794] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbd000 name raid_bdev1, state offline 00:31:27.196 [2024-07-23 17:25:22.518172] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:27.455 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:31:27.455 00:31:27.455 real 0m32.879s 00:31:27.455 user 0m51.867s 00:31:27.455 sys 0m5.318s 00:31:27.455 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:27.455 17:25:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:31:27.455 ************************************ 00:31:27.455 END TEST raid_rebuild_test_sb_io 00:31:27.455 
************************************ 00:31:27.455 17:25:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:31:27.455 17:25:22 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:31:27.455 17:25:22 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:31:27.455 17:25:22 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:31:27.455 17:25:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:31:27.455 17:25:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:27.455 17:25:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:27.455 ************************************ 00:31:27.455 START TEST raid_state_function_test_sb_4k 00:31:27.455 ************************************ 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:27.455 17:25:22 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=58763 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 58763' 00:31:27.455 Process raid pid: 58763 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:31:27.455 17:25:22 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 58763 /var/tmp/spdk-raid.sock 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 58763 ']' 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:27.455 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:27.455 17:25:22 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:27.714 [2024-07-23 17:25:22.883651] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:31:27.714 [2024-07-23 17:25:22.883719] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:27.714 [2024-07-23 17:25:23.015866] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:27.714 [2024-07-23 17:25:23.070635] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:27.714 [2024-07-23 17:25:23.129620] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:27.714 [2024-07-23 17:25:23.129646] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:27.973 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:27.973 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:31:27.973 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:31:28.230 [2024-07-23 17:25:23.560432] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:28.230 [2024-07-23 17:25:23.560473] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:28.230 [2024-07-23 17:25:23.560484] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:28.230 [2024-07-23 17:25:23.560496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:28.230 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:28.488 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:28.488 "name": "Existed_Raid", 00:31:28.488 "uuid": "73ee8668-0cee-4366-a8c5-fdee1d1f77ed", 00:31:28.488 "strip_size_kb": 0, 00:31:28.488 "state": "configuring", 00:31:28.488 "raid_level": "raid1", 00:31:28.488 "superblock": true, 00:31:28.488 "num_base_bdevs": 2, 00:31:28.488 "num_base_bdevs_discovered": 0, 00:31:28.488 "num_base_bdevs_operational": 2, 00:31:28.488 "base_bdevs_list": [ 00:31:28.488 { 00:31:28.488 "name": "BaseBdev1", 00:31:28.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:28.488 "is_configured": false, 00:31:28.488 "data_offset": 0, 00:31:28.488 "data_size": 0 
00:31:28.488 }, 00:31:28.488 { 00:31:28.488 "name": "BaseBdev2", 00:31:28.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:28.488 "is_configured": false, 00:31:28.488 "data_offset": 0, 00:31:28.488 "data_size": 0 00:31:28.488 } 00:31:28.488 ] 00:31:28.488 }' 00:31:28.488 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:28.488 17:25:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:29.055 17:25:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:29.350 [2024-07-23 17:25:24.647174] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:29.350 [2024-07-23 17:25:24.647208] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc733f0 name Existed_Raid, state configuring 00:31:29.350 17:25:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:31:29.610 [2024-07-23 17:25:24.891835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:31:29.610 [2024-07-23 17:25:24.891876] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:31:29.610 [2024-07-23 17:25:24.891886] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:29.610 [2024-07-23 17:25:24.891915] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:29.610 17:25:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:31:30.177 [2024-07-23 
17:25:25.396221] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:30.177 BaseBdev1 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:30.177 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:30.745 17:25:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:31:31.003 [ 00:31:31.003 { 00:31:31.003 "name": "BaseBdev1", 00:31:31.003 "aliases": [ 00:31:31.003 "48f99e17-de17-4549-829a-02bd80ca43c9" 00:31:31.003 ], 00:31:31.003 "product_name": "Malloc disk", 00:31:31.003 "block_size": 4096, 00:31:31.004 "num_blocks": 8192, 00:31:31.004 "uuid": "48f99e17-de17-4549-829a-02bd80ca43c9", 00:31:31.004 "assigned_rate_limits": { 00:31:31.004 "rw_ios_per_sec": 0, 00:31:31.004 "rw_mbytes_per_sec": 0, 00:31:31.004 "r_mbytes_per_sec": 0, 00:31:31.004 "w_mbytes_per_sec": 0 00:31:31.004 }, 00:31:31.004 "claimed": true, 00:31:31.004 "claim_type": "exclusive_write", 00:31:31.004 "zoned": false, 00:31:31.004 "supported_io_types": { 00:31:31.004 "read": true, 00:31:31.004 "write": true, 
00:31:31.004 "unmap": true, 00:31:31.004 "flush": true, 00:31:31.004 "reset": true, 00:31:31.004 "nvme_admin": false, 00:31:31.004 "nvme_io": false, 00:31:31.004 "nvme_io_md": false, 00:31:31.004 "write_zeroes": true, 00:31:31.004 "zcopy": true, 00:31:31.004 "get_zone_info": false, 00:31:31.004 "zone_management": false, 00:31:31.004 "zone_append": false, 00:31:31.004 "compare": false, 00:31:31.004 "compare_and_write": false, 00:31:31.004 "abort": true, 00:31:31.004 "seek_hole": false, 00:31:31.004 "seek_data": false, 00:31:31.004 "copy": true, 00:31:31.004 "nvme_iov_md": false 00:31:31.004 }, 00:31:31.004 "memory_domains": [ 00:31:31.004 { 00:31:31.004 "dma_device_id": "system", 00:31:31.004 "dma_device_type": 1 00:31:31.004 }, 00:31:31.004 { 00:31:31.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:31.004 "dma_device_type": 2 00:31:31.004 } 00:31:31.004 ], 00:31:31.004 "driver_specific": {} 00:31:31.004 } 00:31:31.004 ] 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:31.004 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:31.262 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:31.262 "name": "Existed_Raid", 00:31:31.262 "uuid": "293e169b-95d1-4a4f-b434-5c60e4d3a3ef", 00:31:31.262 "strip_size_kb": 0, 00:31:31.262 "state": "configuring", 00:31:31.262 "raid_level": "raid1", 00:31:31.262 "superblock": true, 00:31:31.262 "num_base_bdevs": 2, 00:31:31.262 "num_base_bdevs_discovered": 1, 00:31:31.262 "num_base_bdevs_operational": 2, 00:31:31.262 "base_bdevs_list": [ 00:31:31.262 { 00:31:31.262 "name": "BaseBdev1", 00:31:31.262 "uuid": "48f99e17-de17-4549-829a-02bd80ca43c9", 00:31:31.262 "is_configured": true, 00:31:31.262 "data_offset": 256, 00:31:31.262 "data_size": 7936 00:31:31.262 }, 00:31:31.262 { 00:31:31.262 "name": "BaseBdev2", 00:31:31.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:31.262 "is_configured": false, 00:31:31.262 "data_offset": 0, 00:31:31.262 "data_size": 0 00:31:31.262 } 00:31:31.262 ] 00:31:31.262 }' 00:31:31.262 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:31.262 17:25:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:31.829 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:31:32.087 [2024-07-23 17:25:27.257205] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:31:32.087 [2024-07-23 17:25:27.257243] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc72d20 name Existed_Raid, state configuring 00:31:32.087 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:31:32.087 [2024-07-23 17:25:27.505904] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:32.087 [2024-07-23 17:25:27.507366] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:31:32.087 [2024-07-23 17:25:27.507400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:32.346 17:25:27 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:32.346 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:32.605 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:32.605 "name": "Existed_Raid", 00:31:32.605 "uuid": "28b4c734-f62c-40a7-acd8-320584b5454b", 00:31:32.605 "strip_size_kb": 0, 00:31:32.605 "state": "configuring", 00:31:32.605 "raid_level": "raid1", 00:31:32.605 "superblock": true, 00:31:32.605 "num_base_bdevs": 2, 00:31:32.605 "num_base_bdevs_discovered": 1, 00:31:32.605 "num_base_bdevs_operational": 2, 00:31:32.605 "base_bdevs_list": [ 00:31:32.605 { 00:31:32.605 "name": "BaseBdev1", 00:31:32.605 "uuid": "48f99e17-de17-4549-829a-02bd80ca43c9", 00:31:32.605 "is_configured": true, 00:31:32.605 "data_offset": 256, 00:31:32.605 "data_size": 7936 00:31:32.605 }, 00:31:32.605 { 00:31:32.605 "name": "BaseBdev2", 00:31:32.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:32.605 "is_configured": false, 00:31:32.605 "data_offset": 0, 00:31:32.605 "data_size": 0 00:31:32.605 } 00:31:32.605 ] 00:31:32.605 }' 00:31:32.605 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:32.605 17:25:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:33.170 
17:25:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:31:33.429 [2024-07-23 17:25:28.628153] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:33.429 [2024-07-23 17:25:28.628299] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc72970 00:31:33.429 [2024-07-23 17:25:28.628312] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:31:33.429 [2024-07-23 17:25:28.628481] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc720c0 00:31:33.429 [2024-07-23 17:25:28.628600] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc72970 00:31:33.429 [2024-07-23 17:25:28.628610] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc72970 00:31:33.429 [2024-07-23 17:25:28.628703] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:33.429 BaseBdev2 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:33.429 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:31:33.688 17:25:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:31:33.947 [ 00:31:33.947 { 00:31:33.947 "name": "BaseBdev2", 00:31:33.947 "aliases": [ 00:31:33.947 "dc8a326b-b5e6-450e-a67f-5ec9046942f1" 00:31:33.947 ], 00:31:33.947 "product_name": "Malloc disk", 00:31:33.947 "block_size": 4096, 00:31:33.947 "num_blocks": 8192, 00:31:33.947 "uuid": "dc8a326b-b5e6-450e-a67f-5ec9046942f1", 00:31:33.947 "assigned_rate_limits": { 00:31:33.947 "rw_ios_per_sec": 0, 00:31:33.947 "rw_mbytes_per_sec": 0, 00:31:33.947 "r_mbytes_per_sec": 0, 00:31:33.947 "w_mbytes_per_sec": 0 00:31:33.947 }, 00:31:33.947 "claimed": true, 00:31:33.947 "claim_type": "exclusive_write", 00:31:33.947 "zoned": false, 00:31:33.947 "supported_io_types": { 00:31:33.947 "read": true, 00:31:33.947 "write": true, 00:31:33.947 "unmap": true, 00:31:33.947 "flush": true, 00:31:33.947 "reset": true, 00:31:33.947 "nvme_admin": false, 00:31:33.947 "nvme_io": false, 00:31:33.947 "nvme_io_md": false, 00:31:33.947 "write_zeroes": true, 00:31:33.947 "zcopy": true, 00:31:33.947 "get_zone_info": false, 00:31:33.947 "zone_management": false, 00:31:33.947 "zone_append": false, 00:31:33.947 "compare": false, 00:31:33.947 "compare_and_write": false, 00:31:33.947 "abort": true, 00:31:33.947 "seek_hole": false, 00:31:33.947 "seek_data": false, 00:31:33.947 "copy": true, 00:31:33.947 "nvme_iov_md": false 00:31:33.947 }, 00:31:33.947 "memory_domains": [ 00:31:33.947 { 00:31:33.947 "dma_device_id": "system", 00:31:33.947 "dma_device_type": 1 00:31:33.947 }, 00:31:33.947 { 00:31:33.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:33.947 "dma_device_type": 2 00:31:33.947 } 00:31:33.947 ], 00:31:33.947 "driver_specific": {} 00:31:33.947 } 00:31:33.947 ] 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@905 -- # return 0 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:33.947 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:34.206 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:34.206 "name": "Existed_Raid", 00:31:34.206 "uuid": 
"28b4c734-f62c-40a7-acd8-320584b5454b", 00:31:34.206 "strip_size_kb": 0, 00:31:34.206 "state": "online", 00:31:34.206 "raid_level": "raid1", 00:31:34.206 "superblock": true, 00:31:34.206 "num_base_bdevs": 2, 00:31:34.206 "num_base_bdevs_discovered": 2, 00:31:34.206 "num_base_bdevs_operational": 2, 00:31:34.206 "base_bdevs_list": [ 00:31:34.206 { 00:31:34.206 "name": "BaseBdev1", 00:31:34.206 "uuid": "48f99e17-de17-4549-829a-02bd80ca43c9", 00:31:34.206 "is_configured": true, 00:31:34.206 "data_offset": 256, 00:31:34.206 "data_size": 7936 00:31:34.206 }, 00:31:34.206 { 00:31:34.206 "name": "BaseBdev2", 00:31:34.206 "uuid": "dc8a326b-b5e6-450e-a67f-5ec9046942f1", 00:31:34.206 "is_configured": true, 00:31:34.206 "data_offset": 256, 00:31:34.206 "data_size": 7936 00:31:34.206 } 00:31:34.206 ] 00:31:34.206 }' 00:31:34.206 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:34.206 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:31:34.773 17:25:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:31:34.773 17:25:29 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:35.033 [2024-07-23 17:25:30.220645] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:35.033 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:35.033 "name": "Existed_Raid", 00:31:35.033 "aliases": [ 00:31:35.033 "28b4c734-f62c-40a7-acd8-320584b5454b" 00:31:35.033 ], 00:31:35.033 "product_name": "Raid Volume", 00:31:35.033 "block_size": 4096, 00:31:35.033 "num_blocks": 7936, 00:31:35.033 "uuid": "28b4c734-f62c-40a7-acd8-320584b5454b", 00:31:35.033 "assigned_rate_limits": { 00:31:35.033 "rw_ios_per_sec": 0, 00:31:35.033 "rw_mbytes_per_sec": 0, 00:31:35.033 "r_mbytes_per_sec": 0, 00:31:35.033 "w_mbytes_per_sec": 0 00:31:35.033 }, 00:31:35.033 "claimed": false, 00:31:35.033 "zoned": false, 00:31:35.033 "supported_io_types": { 00:31:35.033 "read": true, 00:31:35.033 "write": true, 00:31:35.033 "unmap": false, 00:31:35.033 "flush": false, 00:31:35.033 "reset": true, 00:31:35.033 "nvme_admin": false, 00:31:35.033 "nvme_io": false, 00:31:35.033 "nvme_io_md": false, 00:31:35.033 "write_zeroes": true, 00:31:35.033 "zcopy": false, 00:31:35.033 "get_zone_info": false, 00:31:35.033 "zone_management": false, 00:31:35.033 "zone_append": false, 00:31:35.033 "compare": false, 00:31:35.033 "compare_and_write": false, 00:31:35.033 "abort": false, 00:31:35.033 "seek_hole": false, 00:31:35.033 "seek_data": false, 00:31:35.033 "copy": false, 00:31:35.033 "nvme_iov_md": false 00:31:35.033 }, 00:31:35.033 "memory_domains": [ 00:31:35.033 { 00:31:35.033 "dma_device_id": "system", 00:31:35.033 "dma_device_type": 1 00:31:35.033 }, 00:31:35.033 { 00:31:35.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:35.033 "dma_device_type": 2 00:31:35.033 }, 00:31:35.033 { 00:31:35.033 "dma_device_id": "system", 00:31:35.033 "dma_device_type": 1 00:31:35.033 }, 00:31:35.033 { 00:31:35.033 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:31:35.033 "dma_device_type": 2 00:31:35.033 } 00:31:35.033 ], 00:31:35.033 "driver_specific": { 00:31:35.033 "raid": { 00:31:35.033 "uuid": "28b4c734-f62c-40a7-acd8-320584b5454b", 00:31:35.033 "strip_size_kb": 0, 00:31:35.033 "state": "online", 00:31:35.033 "raid_level": "raid1", 00:31:35.033 "superblock": true, 00:31:35.033 "num_base_bdevs": 2, 00:31:35.033 "num_base_bdevs_discovered": 2, 00:31:35.033 "num_base_bdevs_operational": 2, 00:31:35.033 "base_bdevs_list": [ 00:31:35.033 { 00:31:35.033 "name": "BaseBdev1", 00:31:35.033 "uuid": "48f99e17-de17-4549-829a-02bd80ca43c9", 00:31:35.033 "is_configured": true, 00:31:35.033 "data_offset": 256, 00:31:35.033 "data_size": 7936 00:31:35.033 }, 00:31:35.033 { 00:31:35.033 "name": "BaseBdev2", 00:31:35.033 "uuid": "dc8a326b-b5e6-450e-a67f-5ec9046942f1", 00:31:35.033 "is_configured": true, 00:31:35.033 "data_offset": 256, 00:31:35.033 "data_size": 7936 00:31:35.033 } 00:31:35.033 ] 00:31:35.033 } 00:31:35.033 } 00:31:35.033 }' 00:31:35.033 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:35.033 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:31:35.033 BaseBdev2' 00:31:35.033 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:35.033 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:31:35.033 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:35.604 "name": "BaseBdev1", 00:31:35.604 "aliases": [ 00:31:35.604 "48f99e17-de17-4549-829a-02bd80ca43c9" 
00:31:35.604 ], 00:31:35.604 "product_name": "Malloc disk", 00:31:35.604 "block_size": 4096, 00:31:35.604 "num_blocks": 8192, 00:31:35.604 "uuid": "48f99e17-de17-4549-829a-02bd80ca43c9", 00:31:35.604 "assigned_rate_limits": { 00:31:35.604 "rw_ios_per_sec": 0, 00:31:35.604 "rw_mbytes_per_sec": 0, 00:31:35.604 "r_mbytes_per_sec": 0, 00:31:35.604 "w_mbytes_per_sec": 0 00:31:35.604 }, 00:31:35.604 "claimed": true, 00:31:35.604 "claim_type": "exclusive_write", 00:31:35.604 "zoned": false, 00:31:35.604 "supported_io_types": { 00:31:35.604 "read": true, 00:31:35.604 "write": true, 00:31:35.604 "unmap": true, 00:31:35.604 "flush": true, 00:31:35.604 "reset": true, 00:31:35.604 "nvme_admin": false, 00:31:35.604 "nvme_io": false, 00:31:35.604 "nvme_io_md": false, 00:31:35.604 "write_zeroes": true, 00:31:35.604 "zcopy": true, 00:31:35.604 "get_zone_info": false, 00:31:35.604 "zone_management": false, 00:31:35.604 "zone_append": false, 00:31:35.604 "compare": false, 00:31:35.604 "compare_and_write": false, 00:31:35.604 "abort": true, 00:31:35.604 "seek_hole": false, 00:31:35.604 "seek_data": false, 00:31:35.604 "copy": true, 00:31:35.604 "nvme_iov_md": false 00:31:35.604 }, 00:31:35.604 "memory_domains": [ 00:31:35.604 { 00:31:35.604 "dma_device_id": "system", 00:31:35.604 "dma_device_type": 1 00:31:35.604 }, 00:31:35.604 { 00:31:35.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:35.604 "dma_device_type": 2 00:31:35.604 } 00:31:35.604 ], 00:31:35.604 "driver_specific": {} 00:31:35.604 }' 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:35.604 17:25:30 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:35.604 17:25:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:31:35.863 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:36.122 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:36.122 "name": "BaseBdev2", 00:31:36.122 "aliases": [ 00:31:36.122 "dc8a326b-b5e6-450e-a67f-5ec9046942f1" 00:31:36.122 ], 00:31:36.122 "product_name": "Malloc disk", 00:31:36.122 "block_size": 4096, 00:31:36.122 "num_blocks": 8192, 00:31:36.122 "uuid": "dc8a326b-b5e6-450e-a67f-5ec9046942f1", 00:31:36.122 "assigned_rate_limits": { 00:31:36.122 "rw_ios_per_sec": 0, 00:31:36.122 "rw_mbytes_per_sec": 0, 00:31:36.122 "r_mbytes_per_sec": 0, 00:31:36.122 "w_mbytes_per_sec": 0 00:31:36.122 }, 00:31:36.122 "claimed": true, 00:31:36.122 "claim_type": "exclusive_write", 00:31:36.122 "zoned": false, 
00:31:36.122 "supported_io_types": { 00:31:36.122 "read": true, 00:31:36.122 "write": true, 00:31:36.122 "unmap": true, 00:31:36.122 "flush": true, 00:31:36.122 "reset": true, 00:31:36.122 "nvme_admin": false, 00:31:36.122 "nvme_io": false, 00:31:36.122 "nvme_io_md": false, 00:31:36.122 "write_zeroes": true, 00:31:36.122 "zcopy": true, 00:31:36.122 "get_zone_info": false, 00:31:36.122 "zone_management": false, 00:31:36.122 "zone_append": false, 00:31:36.122 "compare": false, 00:31:36.122 "compare_and_write": false, 00:31:36.122 "abort": true, 00:31:36.122 "seek_hole": false, 00:31:36.122 "seek_data": false, 00:31:36.122 "copy": true, 00:31:36.122 "nvme_iov_md": false 00:31:36.122 }, 00:31:36.122 "memory_domains": [ 00:31:36.122 { 00:31:36.122 "dma_device_id": "system", 00:31:36.122 "dma_device_type": 1 00:31:36.122 }, 00:31:36.122 { 00:31:36.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:36.122 "dma_device_type": 2 00:31:36.122 } 00:31:36.122 ], 00:31:36.122 "driver_specific": {} 00:31:36.122 }' 00:31:36.122 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:36.122 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:36.122 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:31:36.122 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:36.380 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:36.639 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:36.639 17:25:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:31:36.639 [2024-07-23 17:25:32.021211] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:36.639 17:25:32 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:36.639 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:31:36.898 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:36.898 "name": "Existed_Raid", 00:31:36.898 "uuid": "28b4c734-f62c-40a7-acd8-320584b5454b", 00:31:36.898 "strip_size_kb": 0, 00:31:36.898 "state": "online", 00:31:36.898 "raid_level": "raid1", 00:31:36.898 "superblock": true, 00:31:36.898 "num_base_bdevs": 2, 00:31:36.898 "num_base_bdevs_discovered": 1, 00:31:36.898 "num_base_bdevs_operational": 1, 00:31:36.898 "base_bdevs_list": [ 00:31:36.898 { 00:31:36.898 "name": null, 00:31:36.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:36.898 "is_configured": false, 00:31:36.898 "data_offset": 256, 00:31:36.898 "data_size": 7936 00:31:36.898 }, 00:31:36.898 { 00:31:36.898 "name": "BaseBdev2", 00:31:36.898 "uuid": "dc8a326b-b5e6-450e-a67f-5ec9046942f1", 00:31:36.898 "is_configured": true, 00:31:36.898 "data_offset": 256, 00:31:36.898 "data_size": 7936 00:31:36.898 } 00:31:36.898 ] 00:31:36.898 }' 00:31:36.898 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:36.898 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:37.833 17:25:32 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:31:37.833 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:37.833 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:37.833 17:25:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:31:37.833 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:31:37.833 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:31:37.833 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:31:38.092 [2024-07-23 17:25:33.366127] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:31:38.092 [2024-07-23 17:25:33.366217] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:38.092 [2024-07-23 17:25:33.378852] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:38.092 [2024-07-23 17:25:33.378889] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:38.092 [2024-07-23 17:25:33.378908] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc72970 name Existed_Raid, state offline 00:31:38.092 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:31:38.092 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:31:38.092 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:38.092 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 58763 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 58763 ']' 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 58763 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 58763 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 58763' 00:31:38.350 killing process with pid 58763 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 58763 00:31:38.350 [2024-07-23 17:25:33.714752] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:38.350 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@972 -- # wait 58763 00:31:38.350 [2024-07-23 17:25:33.715654] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:38.609 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:31:38.609 00:31:38.609 real 0m11.096s 00:31:38.609 user 0m20.234s 00:31:38.609 sys 0m2.129s 00:31:38.609 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:38.609 17:25:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:38.609 ************************************ 00:31:38.609 END TEST raid_state_function_test_sb_4k 00:31:38.609 ************************************ 00:31:38.609 17:25:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:31:38.609 17:25:33 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:31:38.609 17:25:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:31:38.609 17:25:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:38.609 17:25:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:38.609 ************************************ 00:31:38.609 START TEST raid_superblock_test_4k 00:31:38.609 ************************************ 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:31:38.609 
17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:31:38.609 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=60393 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 60393 /var/tmp/spdk-raid.sock 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 60393 ']' 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:38.610 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:38.610 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:38.868 [2024-07-23 17:25:34.072448] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:31:38.868 [2024-07-23 17:25:34.072514] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid60393 ] 00:31:38.868 [2024-07-23 17:25:34.205656] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:38.868 [2024-07-23 17:25:34.255832] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:39.127 [2024-07-23 17:25:34.309549] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:39.127 [2024-07-23 17:25:34.309580] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:39.127 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:31:39.386 malloc1 00:31:39.386 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:39.645 [2024-07-23 17:25:34.828656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:39.645 [2024-07-23 17:25:34.828705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:39.645 [2024-07-23 17:25:34.828728] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad6070 00:31:39.645 [2024-07-23 17:25:34.828741] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:39.645 [2024-07-23 17:25:34.830368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:39.645 [2024-07-23 17:25:34.830397] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:39.645 pt1 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:31:39.645 17:25:34 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:31:39.645 17:25:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:31:39.904 malloc2 00:31:39.904 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:39.904 [2024-07-23 17:25:35.319913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:39.904 [2024-07-23 17:25:35.319959] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:39.904 [2024-07-23 17:25:35.319977] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9bc920 00:31:39.904 [2024-07-23 17:25:35.319990] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:39.904 [2024-07-23 17:25:35.321698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:39.904 [2024-07-23 17:25:35.321727] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:39.904 pt2 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:31:40.164 [2024-07-23 17:25:35.564576] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:40.164 [2024-07-23 17:25:35.565927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:40.164 [2024-07-23 17:25:35.566070] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xace3e0 00:31:40.164 [2024-07-23 17:25:35.566088] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:31:40.164 [2024-07-23 17:25:35.566277] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xacf280 00:31:40.164 [2024-07-23 17:25:35.566421] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xace3e0 00:31:40.164 [2024-07-23 17:25:35.566432] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xace3e0 00:31:40.164 [2024-07-23 17:25:35.566529] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:40.164 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:40.423 "name": "raid_bdev1", 00:31:40.423 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:40.423 "strip_size_kb": 0, 00:31:40.423 "state": "online", 00:31:40.423 "raid_level": "raid1", 00:31:40.423 "superblock": true, 00:31:40.423 "num_base_bdevs": 2, 00:31:40.423 "num_base_bdevs_discovered": 2, 00:31:40.423 "num_base_bdevs_operational": 2, 00:31:40.423 "base_bdevs_list": [ 00:31:40.423 { 00:31:40.423 "name": "pt1", 00:31:40.423 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:40.423 "is_configured": true, 00:31:40.423 "data_offset": 256, 00:31:40.423 "data_size": 7936 00:31:40.423 }, 00:31:40.423 { 00:31:40.423 "name": "pt2", 00:31:40.423 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:40.423 "is_configured": true, 00:31:40.423 "data_offset": 256, 00:31:40.423 "data_size": 7936 00:31:40.423 } 00:31:40.423 ] 00:31:40.423 }' 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:40.423 17:25:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 
00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:41.359 [2024-07-23 17:25:36.683740] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:41.359 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:41.359 "name": "raid_bdev1", 00:31:41.359 "aliases": [ 00:31:41.359 "ae873dba-5cf8-4a31-a774-2dcd9b34b490" 00:31:41.359 ], 00:31:41.359 "product_name": "Raid Volume", 00:31:41.359 "block_size": 4096, 00:31:41.359 "num_blocks": 7936, 00:31:41.359 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:41.359 "assigned_rate_limits": { 00:31:41.359 "rw_ios_per_sec": 0, 00:31:41.359 "rw_mbytes_per_sec": 0, 00:31:41.359 "r_mbytes_per_sec": 0, 00:31:41.359 "w_mbytes_per_sec": 0 00:31:41.359 }, 00:31:41.359 "claimed": false, 00:31:41.359 "zoned": false, 00:31:41.359 "supported_io_types": { 00:31:41.359 "read": true, 00:31:41.359 "write": true, 00:31:41.359 "unmap": false, 00:31:41.359 "flush": false, 00:31:41.359 "reset": true, 00:31:41.359 "nvme_admin": false, 00:31:41.359 "nvme_io": false, 00:31:41.359 "nvme_io_md": false, 00:31:41.359 "write_zeroes": true, 00:31:41.359 "zcopy": false, 00:31:41.360 "get_zone_info": false, 
00:31:41.360 "zone_management": false, 00:31:41.360 "zone_append": false, 00:31:41.360 "compare": false, 00:31:41.360 "compare_and_write": false, 00:31:41.360 "abort": false, 00:31:41.360 "seek_hole": false, 00:31:41.360 "seek_data": false, 00:31:41.360 "copy": false, 00:31:41.360 "nvme_iov_md": false 00:31:41.360 }, 00:31:41.360 "memory_domains": [ 00:31:41.360 { 00:31:41.360 "dma_device_id": "system", 00:31:41.360 "dma_device_type": 1 00:31:41.360 }, 00:31:41.360 { 00:31:41.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:41.360 "dma_device_type": 2 00:31:41.360 }, 00:31:41.360 { 00:31:41.360 "dma_device_id": "system", 00:31:41.360 "dma_device_type": 1 00:31:41.360 }, 00:31:41.360 { 00:31:41.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:41.360 "dma_device_type": 2 00:31:41.360 } 00:31:41.360 ], 00:31:41.360 "driver_specific": { 00:31:41.360 "raid": { 00:31:41.360 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:41.360 "strip_size_kb": 0, 00:31:41.360 "state": "online", 00:31:41.360 "raid_level": "raid1", 00:31:41.360 "superblock": true, 00:31:41.360 "num_base_bdevs": 2, 00:31:41.360 "num_base_bdevs_discovered": 2, 00:31:41.360 "num_base_bdevs_operational": 2, 00:31:41.360 "base_bdevs_list": [ 00:31:41.360 { 00:31:41.360 "name": "pt1", 00:31:41.360 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:41.360 "is_configured": true, 00:31:41.360 "data_offset": 256, 00:31:41.360 "data_size": 7936 00:31:41.360 }, 00:31:41.360 { 00:31:41.360 "name": "pt2", 00:31:41.360 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:41.360 "is_configured": true, 00:31:41.360 "data_offset": 256, 00:31:41.360 "data_size": 7936 00:31:41.360 } 00:31:41.360 ] 00:31:41.360 } 00:31:41.360 } 00:31:41.360 }' 00:31:41.360 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:41.360 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:31:41.360 pt2' 00:31:41.360 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:41.360 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:31:41.360 17:25:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:41.618 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:41.618 "name": "pt1", 00:31:41.618 "aliases": [ 00:31:41.618 "00000000-0000-0000-0000-000000000001" 00:31:41.618 ], 00:31:41.618 "product_name": "passthru", 00:31:41.618 "block_size": 4096, 00:31:41.619 "num_blocks": 8192, 00:31:41.619 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:41.619 "assigned_rate_limits": { 00:31:41.619 "rw_ios_per_sec": 0, 00:31:41.619 "rw_mbytes_per_sec": 0, 00:31:41.619 "r_mbytes_per_sec": 0, 00:31:41.619 "w_mbytes_per_sec": 0 00:31:41.619 }, 00:31:41.619 "claimed": true, 00:31:41.619 "claim_type": "exclusive_write", 00:31:41.619 "zoned": false, 00:31:41.619 "supported_io_types": { 00:31:41.619 "read": true, 00:31:41.619 "write": true, 00:31:41.619 "unmap": true, 00:31:41.619 "flush": true, 00:31:41.619 "reset": true, 00:31:41.619 "nvme_admin": false, 00:31:41.619 "nvme_io": false, 00:31:41.619 "nvme_io_md": false, 00:31:41.619 "write_zeroes": true, 00:31:41.619 "zcopy": true, 00:31:41.619 "get_zone_info": false, 00:31:41.619 "zone_management": false, 00:31:41.619 "zone_append": false, 00:31:41.619 "compare": false, 00:31:41.619 "compare_and_write": false, 00:31:41.619 "abort": true, 00:31:41.619 "seek_hole": false, 00:31:41.619 "seek_data": false, 00:31:41.619 "copy": true, 00:31:41.619 "nvme_iov_md": false 00:31:41.619 }, 00:31:41.619 "memory_domains": [ 00:31:41.619 { 00:31:41.619 "dma_device_id": "system", 00:31:41.619 "dma_device_type": 1 00:31:41.619 }, 00:31:41.619 { 
00:31:41.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:41.619 "dma_device_type": 2 00:31:41.619 } 00:31:41.619 ], 00:31:41.619 "driver_specific": { 00:31:41.619 "passthru": { 00:31:41.619 "name": "pt1", 00:31:41.619 "base_bdev_name": "malloc1" 00:31:41.619 } 00:31:41.619 } 00:31:41.619 }' 00:31:41.619 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:41.877 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:31:42.136 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:42.395 "name": "pt2", 00:31:42.395 "aliases": [ 00:31:42.395 "00000000-0000-0000-0000-000000000002" 00:31:42.395 ], 00:31:42.395 "product_name": "passthru", 00:31:42.395 "block_size": 4096, 00:31:42.395 "num_blocks": 8192, 00:31:42.395 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:42.395 "assigned_rate_limits": { 00:31:42.395 "rw_ios_per_sec": 0, 00:31:42.395 "rw_mbytes_per_sec": 0, 00:31:42.395 "r_mbytes_per_sec": 0, 00:31:42.395 "w_mbytes_per_sec": 0 00:31:42.395 }, 00:31:42.395 "claimed": true, 00:31:42.395 "claim_type": "exclusive_write", 00:31:42.395 "zoned": false, 00:31:42.395 "supported_io_types": { 00:31:42.395 "read": true, 00:31:42.395 "write": true, 00:31:42.395 "unmap": true, 00:31:42.395 "flush": true, 00:31:42.395 "reset": true, 00:31:42.395 "nvme_admin": false, 00:31:42.395 "nvme_io": false, 00:31:42.395 "nvme_io_md": false, 00:31:42.395 "write_zeroes": true, 00:31:42.395 "zcopy": true, 00:31:42.395 "get_zone_info": false, 00:31:42.395 "zone_management": false, 00:31:42.395 "zone_append": false, 00:31:42.395 "compare": false, 00:31:42.395 "compare_and_write": false, 00:31:42.395 "abort": true, 00:31:42.395 "seek_hole": false, 00:31:42.395 "seek_data": false, 00:31:42.395 "copy": true, 00:31:42.395 "nvme_iov_md": false 00:31:42.395 }, 00:31:42.395 "memory_domains": [ 00:31:42.395 { 00:31:42.395 "dma_device_id": "system", 00:31:42.395 "dma_device_type": 1 00:31:42.395 }, 00:31:42.395 { 00:31:42.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:42.395 "dma_device_type": 2 00:31:42.395 } 00:31:42.395 ], 00:31:42.395 "driver_specific": { 00:31:42.395 "passthru": { 00:31:42.395 "name": "pt2", 00:31:42.395 "base_bdev_name": "malloc2" 00:31:42.395 } 00:31:42.395 } 00:31:42.395 }' 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:42.395 
17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:42.395 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:42.653 17:25:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:31:42.911 [2024-07-23 17:25:38.207956] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:42.911 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ae873dba-5cf8-4a31-a774-2dcd9b34b490 00:31:42.911 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z ae873dba-5cf8-4a31-a774-2dcd9b34b490 ']' 00:31:42.911 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:43.169 [2024-07-23 17:25:38.460372] bdev_raid.c:2382:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:31:43.169 [2024-07-23 17:25:38.460392] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:43.169 [2024-07-23 17:25:38.460446] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:43.169 [2024-07-23 17:25:38.460497] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:43.169 [2024-07-23 17:25:38.460508] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xace3e0 name raid_bdev1, state offline 00:31:43.169 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:43.169 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:31:43.427 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:31:43.427 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:31:43.427 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:43.427 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:31:43.707 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:31:43.707 17:25:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:31:43.982 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:31:43.982 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # 
jq -r '[.[] | select(.product_name == "passthru")] | any' 00:31:44.240 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:31:44.240 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:31:44.240 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:31:44.240 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:31:44.240 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:44.240 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:44.241 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:44.241 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:44.241 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:44.241 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:44.241 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:44.241 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:31:44.241 17:25:39 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:31:44.500 [2024-07-23 17:25:39.707621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:31:44.500 [2024-07-23 17:25:39.708943] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:31:44.500 [2024-07-23 17:25:39.708997] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:31:44.500 [2024-07-23 17:25:39.709036] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:31:44.500 [2024-07-23 17:25:39.709054] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:44.500 [2024-07-23 17:25:39.709063] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9bc090 name raid_bdev1, state configuring 00:31:44.500 request: 00:31:44.500 { 00:31:44.500 "name": "raid_bdev1", 00:31:44.500 "raid_level": "raid1", 00:31:44.500 "base_bdevs": [ 00:31:44.500 "malloc1", 00:31:44.500 "malloc2" 00:31:44.500 ], 00:31:44.500 "superblock": false, 00:31:44.500 "method": "bdev_raid_create", 00:31:44.500 "req_id": 1 00:31:44.500 } 00:31:44.500 Got JSON-RPC error response 00:31:44.500 response: 00:31:44.500 { 00:31:44.500 "code": -17, 00:31:44.500 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:31:44.500 } 00:31:44.500 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:31:44.500 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:31:44.500 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:31:44.500 17:25:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( 
!es == 0 )) 00:31:44.500 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:44.500 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:31:44.758 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:31:44.758 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:31:44.758 17:25:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:45.017 [2024-07-23 17:25:40.184848] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:45.017 [2024-07-23 17:25:40.184904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:45.017 [2024-07-23 17:25:40.184923] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad71a0 00:31:45.017 [2024-07-23 17:25:40.184935] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:45.017 [2024-07-23 17:25:40.186504] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:45.017 [2024-07-23 17:25:40.186532] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:45.017 [2024-07-23 17:25:40.186599] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:31:45.017 [2024-07-23 17:25:40.186623] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:45.017 pt1 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:45.017 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:45.276 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:45.276 "name": "raid_bdev1", 00:31:45.276 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:45.276 "strip_size_kb": 0, 00:31:45.276 "state": "configuring", 00:31:45.276 "raid_level": "raid1", 00:31:45.276 "superblock": true, 00:31:45.276 "num_base_bdevs": 2, 00:31:45.276 "num_base_bdevs_discovered": 1, 00:31:45.276 "num_base_bdevs_operational": 2, 00:31:45.276 "base_bdevs_list": [ 00:31:45.276 { 00:31:45.276 "name": "pt1", 00:31:45.276 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:45.276 "is_configured": true, 00:31:45.276 "data_offset": 256, 00:31:45.276 "data_size": 7936 00:31:45.276 }, 00:31:45.276 { 00:31:45.276 "name": null, 
00:31:45.276 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:45.276 "is_configured": false, 00:31:45.276 "data_offset": 256, 00:31:45.276 "data_size": 7936 00:31:45.276 } 00:31:45.276 ] 00:31:45.276 }' 00:31:45.276 17:25:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:45.276 17:25:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:45.843 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:31:45.843 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:31:45.843 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:45.843 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:46.101 [2024-07-23 17:25:41.287773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:46.101 [2024-07-23 17:25:41.287820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:46.101 [2024-07-23 17:25:41.287839] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9258a0 00:31:46.101 [2024-07-23 17:25:41.287851] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:46.101 [2024-07-23 17:25:41.288195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:46.101 [2024-07-23 17:25:41.288213] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:46.101 [2024-07-23 17:25:41.288272] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:31:46.101 [2024-07-23 17:25:41.288290] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:46.101 [2024-07-23 17:25:41.288385] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x925350 00:31:46.101 [2024-07-23 17:25:41.288396] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:31:46.101 [2024-07-23 17:25:41.288559] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x922f80 00:31:46.101 [2024-07-23 17:25:41.288687] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x925350 00:31:46.101 [2024-07-23 17:25:41.288697] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x925350 00:31:46.101 [2024-07-23 17:25:41.288790] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:46.101 pt2 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:46.101 17:25:41 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:46.101 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:46.360 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:46.360 "name": "raid_bdev1", 00:31:46.360 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:46.360 "strip_size_kb": 0, 00:31:46.360 "state": "online", 00:31:46.360 "raid_level": "raid1", 00:31:46.360 "superblock": true, 00:31:46.360 "num_base_bdevs": 2, 00:31:46.360 "num_base_bdevs_discovered": 2, 00:31:46.360 "num_base_bdevs_operational": 2, 00:31:46.360 "base_bdevs_list": [ 00:31:46.360 { 00:31:46.360 "name": "pt1", 00:31:46.360 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:46.360 "is_configured": true, 00:31:46.360 "data_offset": 256, 00:31:46.360 "data_size": 7936 00:31:46.360 }, 00:31:46.360 { 00:31:46.360 "name": "pt2", 00:31:46.360 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:46.360 "is_configured": true, 00:31:46.360 "data_offset": 256, 00:31:46.360 "data_size": 7936 00:31:46.360 } 00:31:46.360 ] 00:31:46.360 }' 00:31:46.360 17:25:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:46.360 17:25:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 
-- # local base_bdev_info 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:46.928 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:31:47.187 [2024-07-23 17:25:42.390957] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:47.187 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:31:47.187 "name": "raid_bdev1", 00:31:47.187 "aliases": [ 00:31:47.187 "ae873dba-5cf8-4a31-a774-2dcd9b34b490" 00:31:47.187 ], 00:31:47.187 "product_name": "Raid Volume", 00:31:47.187 "block_size": 4096, 00:31:47.187 "num_blocks": 7936, 00:31:47.187 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:47.187 "assigned_rate_limits": { 00:31:47.187 "rw_ios_per_sec": 0, 00:31:47.187 "rw_mbytes_per_sec": 0, 00:31:47.187 "r_mbytes_per_sec": 0, 00:31:47.187 "w_mbytes_per_sec": 0 00:31:47.187 }, 00:31:47.187 "claimed": false, 00:31:47.187 "zoned": false, 00:31:47.187 "supported_io_types": { 00:31:47.187 "read": true, 00:31:47.187 "write": true, 00:31:47.187 "unmap": false, 00:31:47.187 "flush": false, 00:31:47.187 "reset": true, 00:31:47.187 "nvme_admin": false, 00:31:47.187 "nvme_io": false, 00:31:47.187 "nvme_io_md": false, 00:31:47.187 "write_zeroes": true, 00:31:47.187 "zcopy": false, 00:31:47.187 "get_zone_info": false, 00:31:47.187 "zone_management": false, 00:31:47.187 "zone_append": false, 00:31:47.187 "compare": false, 00:31:47.187 "compare_and_write": false, 00:31:47.187 "abort": false, 00:31:47.187 "seek_hole": false, 00:31:47.187 "seek_data": false, 00:31:47.187 "copy": false, 00:31:47.187 "nvme_iov_md": false 
00:31:47.187 }, 00:31:47.187 "memory_domains": [ 00:31:47.187 { 00:31:47.187 "dma_device_id": "system", 00:31:47.187 "dma_device_type": 1 00:31:47.187 }, 00:31:47.187 { 00:31:47.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.187 "dma_device_type": 2 00:31:47.187 }, 00:31:47.187 { 00:31:47.187 "dma_device_id": "system", 00:31:47.187 "dma_device_type": 1 00:31:47.187 }, 00:31:47.187 { 00:31:47.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.187 "dma_device_type": 2 00:31:47.187 } 00:31:47.187 ], 00:31:47.187 "driver_specific": { 00:31:47.187 "raid": { 00:31:47.187 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:47.187 "strip_size_kb": 0, 00:31:47.187 "state": "online", 00:31:47.187 "raid_level": "raid1", 00:31:47.187 "superblock": true, 00:31:47.187 "num_base_bdevs": 2, 00:31:47.187 "num_base_bdevs_discovered": 2, 00:31:47.187 "num_base_bdevs_operational": 2, 00:31:47.187 "base_bdevs_list": [ 00:31:47.187 { 00:31:47.187 "name": "pt1", 00:31:47.187 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:47.187 "is_configured": true, 00:31:47.187 "data_offset": 256, 00:31:47.187 "data_size": 7936 00:31:47.187 }, 00:31:47.187 { 00:31:47.187 "name": "pt2", 00:31:47.187 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:47.187 "is_configured": true, 00:31:47.187 "data_offset": 256, 00:31:47.187 "data_size": 7936 00:31:47.187 } 00:31:47.187 ] 00:31:47.187 } 00:31:47.187 } 00:31:47.187 }' 00:31:47.187 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:31:47.187 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:31:47.187 pt2' 00:31:47.187 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:47.187 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:31:47.187 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:47.446 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:47.446 "name": "pt1", 00:31:47.446 "aliases": [ 00:31:47.446 "00000000-0000-0000-0000-000000000001" 00:31:47.446 ], 00:31:47.446 "product_name": "passthru", 00:31:47.446 "block_size": 4096, 00:31:47.446 "num_blocks": 8192, 00:31:47.446 "uuid": "00000000-0000-0000-0000-000000000001", 00:31:47.446 "assigned_rate_limits": { 00:31:47.446 "rw_ios_per_sec": 0, 00:31:47.446 "rw_mbytes_per_sec": 0, 00:31:47.446 "r_mbytes_per_sec": 0, 00:31:47.446 "w_mbytes_per_sec": 0 00:31:47.446 }, 00:31:47.446 "claimed": true, 00:31:47.446 "claim_type": "exclusive_write", 00:31:47.446 "zoned": false, 00:31:47.446 "supported_io_types": { 00:31:47.446 "read": true, 00:31:47.446 "write": true, 00:31:47.446 "unmap": true, 00:31:47.446 "flush": true, 00:31:47.446 "reset": true, 00:31:47.446 "nvme_admin": false, 00:31:47.447 "nvme_io": false, 00:31:47.447 "nvme_io_md": false, 00:31:47.447 "write_zeroes": true, 00:31:47.447 "zcopy": true, 00:31:47.447 "get_zone_info": false, 00:31:47.447 "zone_management": false, 00:31:47.447 "zone_append": false, 00:31:47.447 "compare": false, 00:31:47.447 "compare_and_write": false, 00:31:47.447 "abort": true, 00:31:47.447 "seek_hole": false, 00:31:47.447 "seek_data": false, 00:31:47.447 "copy": true, 00:31:47.447 "nvme_iov_md": false 00:31:47.447 }, 00:31:47.447 "memory_domains": [ 00:31:47.447 { 00:31:47.447 "dma_device_id": "system", 00:31:47.447 "dma_device_type": 1 00:31:47.447 }, 00:31:47.447 { 00:31:47.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:47.447 "dma_device_type": 2 00:31:47.447 } 00:31:47.447 ], 00:31:47.447 "driver_specific": { 00:31:47.447 "passthru": { 00:31:47.447 "name": "pt1", 00:31:47.447 "base_bdev_name": "malloc1" 00:31:47.447 } 00:31:47.447 } 00:31:47.447 }' 00:31:47.447 17:25:42 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:47.447 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:47.447 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:31:47.447 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:47.706 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:47.706 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:47.706 17:25:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:47.706 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:31:47.965 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:31:48.223 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:31:48.223 "name": "pt2", 00:31:48.223 "aliases": [ 00:31:48.223 "00000000-0000-0000-0000-000000000002" 00:31:48.223 ], 00:31:48.223 "product_name": "passthru", 00:31:48.223 "block_size": 4096, 00:31:48.223 "num_blocks": 8192, 00:31:48.223 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:31:48.223 "assigned_rate_limits": { 00:31:48.223 "rw_ios_per_sec": 0, 00:31:48.223 "rw_mbytes_per_sec": 0, 00:31:48.223 "r_mbytes_per_sec": 0, 00:31:48.223 "w_mbytes_per_sec": 0 00:31:48.223 }, 00:31:48.223 "claimed": true, 00:31:48.223 "claim_type": "exclusive_write", 00:31:48.223 "zoned": false, 00:31:48.223 "supported_io_types": { 00:31:48.223 "read": true, 00:31:48.223 "write": true, 00:31:48.223 "unmap": true, 00:31:48.223 "flush": true, 00:31:48.223 "reset": true, 00:31:48.223 "nvme_admin": false, 00:31:48.223 "nvme_io": false, 00:31:48.223 "nvme_io_md": false, 00:31:48.223 "write_zeroes": true, 00:31:48.223 "zcopy": true, 00:31:48.223 "get_zone_info": false, 00:31:48.223 "zone_management": false, 00:31:48.223 "zone_append": false, 00:31:48.223 "compare": false, 00:31:48.223 "compare_and_write": false, 00:31:48.223 "abort": true, 00:31:48.223 "seek_hole": false, 00:31:48.223 "seek_data": false, 00:31:48.223 "copy": true, 00:31:48.223 "nvme_iov_md": false 00:31:48.223 }, 00:31:48.223 "memory_domains": [ 00:31:48.223 { 00:31:48.223 "dma_device_id": "system", 00:31:48.223 "dma_device_type": 1 00:31:48.223 }, 00:31:48.223 { 00:31:48.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:48.223 "dma_device_type": 2 00:31:48.223 } 00:31:48.223 ], 00:31:48.223 "driver_specific": { 00:31:48.223 "passthru": { 00:31:48.223 "name": "pt2", 00:31:48.223 "base_bdev_name": "malloc2" 00:31:48.223 } 00:31:48.223 } 00:31:48.223 }' 00:31:48.223 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:48.223 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:31:48.223 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:31:48.223 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:48.481 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:31:48.481 
17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:31:48.481 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:48.481 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:31:48.481 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:31:48.481 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:48.739 17:25:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:31:48.739 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:31:48.739 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:48.739 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:31:49.306 [2024-07-23 17:25:44.516606] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:49.306 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' ae873dba-5cf8-4a31-a774-2dcd9b34b490 '!=' ae873dba-5cf8-4a31-a774-2dcd9b34b490 ']' 00:31:49.306 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:31:49.306 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:31:49.306 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:31:49.306 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:31:49.564 [2024-07-23 17:25:44.777064] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:49.565 17:25:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:50.132 17:25:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:50.132 "name": "raid_bdev1", 00:31:50.132 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:50.132 "strip_size_kb": 0, 00:31:50.132 "state": "online", 00:31:50.132 "raid_level": "raid1", 00:31:50.132 "superblock": true, 00:31:50.132 "num_base_bdevs": 2, 00:31:50.132 "num_base_bdevs_discovered": 1, 00:31:50.132 "num_base_bdevs_operational": 1, 00:31:50.132 "base_bdevs_list": [ 00:31:50.132 { 00:31:50.132 "name": null, 00:31:50.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:50.132 
"is_configured": false, 00:31:50.132 "data_offset": 256, 00:31:50.132 "data_size": 7936 00:31:50.132 }, 00:31:50.132 { 00:31:50.132 "name": "pt2", 00:31:50.132 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:50.132 "is_configured": true, 00:31:50.132 "data_offset": 256, 00:31:50.132 "data_size": 7936 00:31:50.132 } 00:31:50.132 ] 00:31:50.132 }' 00:31:50.132 17:25:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:50.132 17:25:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:50.699 17:25:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:50.957 [2024-07-23 17:25:46.208817] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:50.957 [2024-07-23 17:25:46.208844] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:50.957 [2024-07-23 17:25:46.208902] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:50.957 [2024-07-23 17:25:46.208946] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:50.957 [2024-07-23 17:25:46.208957] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x925350 name raid_bdev1, state offline 00:31:50.957 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:50.957 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:31:51.215 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:31:51.473 [2024-07-23 17:25:46.738200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:31:51.473 [2024-07-23 17:25:46.738239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:51.473 [2024-07-23 17:25:46.738261] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad7850 00:31:51.473 [2024-07-23 17:25:46.738273] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:51.473 [2024-07-23 17:25:46.739821] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:51.473 [2024-07-23 17:25:46.739854] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:31:51.473 [2024-07-23 17:25:46.739922] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:31:51.473 [2024-07-23 
17:25:46.739946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:51.473 [2024-07-23 17:25:46.740022] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xad0990 00:31:51.473 [2024-07-23 17:25:46.740032] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:31:51.473 [2024-07-23 17:25:46.740192] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad6300 00:31:51.473 [2024-07-23 17:25:46.740308] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad0990 00:31:51.473 [2024-07-23 17:25:46.740317] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xad0990 00:31:51.473 [2024-07-23 17:25:46.740408] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:51.473 pt2 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 
00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:51.473 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:51.732 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:51.732 "name": "raid_bdev1", 00:31:51.732 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:51.732 "strip_size_kb": 0, 00:31:51.732 "state": "online", 00:31:51.732 "raid_level": "raid1", 00:31:51.732 "superblock": true, 00:31:51.732 "num_base_bdevs": 2, 00:31:51.732 "num_base_bdevs_discovered": 1, 00:31:51.732 "num_base_bdevs_operational": 1, 00:31:51.732 "base_bdevs_list": [ 00:31:51.732 { 00:31:51.732 "name": null, 00:31:51.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:51.732 "is_configured": false, 00:31:51.732 "data_offset": 256, 00:31:51.732 "data_size": 7936 00:31:51.732 }, 00:31:51.732 { 00:31:51.732 "name": "pt2", 00:31:51.732 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:51.732 "is_configured": true, 00:31:51.732 "data_offset": 256, 00:31:51.732 "data_size": 7936 00:31:51.732 } 00:31:51.732 ] 00:31:51.732 }' 00:31:51.732 17:25:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:51.732 17:25:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:52.298 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:52.556 [2024-07-23 17:25:47.748874] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:52.556 [2024-07-23 17:25:47.748907] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:52.556 [2024-07-23 17:25:47.748957] 
bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:52.556 [2024-07-23 17:25:47.748998] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:52.556 [2024-07-23 17:25:47.749009] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad0990 name raid_bdev1, state offline 00:31:52.556 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:52.556 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:31:52.556 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:31:52.556 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:31:52.556 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:31:52.556 17:25:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:31:52.814 [2024-07-23 17:25:48.105810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:31:52.814 [2024-07-23 17:25:48.105855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:52.814 [2024-07-23 17:25:48.105873] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x924a80 00:31:52.814 [2024-07-23 17:25:48.105885] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:52.814 [2024-07-23 17:25:48.107448] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:52.814 [2024-07-23 17:25:48.107475] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:31:52.814 [2024-07-23 17:25:48.107534] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:31:52.814 [2024-07-23 17:25:48.107556] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:31:52.814 [2024-07-23 17:25:48.107651] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:31:52.814 [2024-07-23 17:25:48.107664] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:31:52.815 [2024-07-23 17:25:48.107677] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad4690 name raid_bdev1, state configuring 00:31:52.815 [2024-07-23 17:25:48.107699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:31:52.815 [2024-07-23 17:25:48.107752] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xacff70 00:31:52.815 [2024-07-23 17:25:48.107762] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:31:52.815 [2024-07-23 17:25:48.107928] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad58d0 00:31:52.815 [2024-07-23 17:25:48.108045] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xacff70 00:31:52.815 [2024-07-23 17:25:48.108055] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xacff70 00:31:52.815 [2024-07-23 17:25:48.108143] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:52.815 pt1 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:52.815 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:53.073 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:53.073 "name": "raid_bdev1", 00:31:53.073 "uuid": "ae873dba-5cf8-4a31-a774-2dcd9b34b490", 00:31:53.073 "strip_size_kb": 0, 00:31:53.073 "state": "online", 00:31:53.073 "raid_level": "raid1", 00:31:53.073 "superblock": true, 00:31:53.073 "num_base_bdevs": 2, 00:31:53.073 "num_base_bdevs_discovered": 1, 00:31:53.073 "num_base_bdevs_operational": 1, 00:31:53.073 "base_bdevs_list": [ 00:31:53.073 { 00:31:53.073 "name": null, 00:31:53.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:53.073 "is_configured": false, 00:31:53.073 "data_offset": 256, 00:31:53.073 "data_size": 7936 00:31:53.073 }, 00:31:53.073 { 00:31:53.073 "name": "pt2", 00:31:53.074 "uuid": "00000000-0000-0000-0000-000000000002", 00:31:53.074 "is_configured": true, 00:31:53.074 "data_offset": 
256, 00:31:53.074 "data_size": 7936 00:31:53.074 } 00:31:53.074 ] 00:31:53.074 }' 00:31:53.074 17:25:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:53.074 17:25:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:54.009 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:31:54.010 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:31:54.010 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:31:54.010 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:54.010 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:31:54.268 [2024-07-23 17:25:49.525795] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' ae873dba-5cf8-4a31-a774-2dcd9b34b490 '!=' ae873dba-5cf8-4a31-a774-2dcd9b34b490 ']' 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 60393 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 60393 ']' 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 60393 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 60393 
00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 60393' 00:31:54.268 killing process with pid 60393 00:31:54.268 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 60393 00:31:54.268 [2024-07-23 17:25:49.604284] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:54.268 [2024-07-23 17:25:49.604335] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:54.268 [2024-07-23 17:25:49.604376] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:54.269 [2024-07-23 17:25:49.604387] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xacff70 name raid_bdev1, state offline 00:31:54.269 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 60393 00:31:54.269 [2024-07-23 17:25:49.622190] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:54.527 17:25:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:31:54.527 00:31:54.527 real 0m15.825s 00:31:54.527 user 0m29.112s 00:31:54.527 sys 0m2.980s 00:31:54.527 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:54.527 17:25:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:31:54.527 ************************************ 00:31:54.527 END TEST raid_superblock_test_4k 00:31:54.527 ************************************ 00:31:54.527 17:25:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:31:54.527 17:25:49 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:31:54.527 17:25:49 bdev_raid -- 
bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:31:54.527 17:25:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:54.527 17:25:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:54.527 17:25:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:54.527 ************************************ 00:31:54.527 START TEST raid_rebuild_test_sb_4k 00:31:54.527 ************************************ 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= 
num_base_bdevs )) 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:31:54.527 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=62812 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 62812 /var/tmp/spdk-raid.sock 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 62812 ']' 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:31:54.528 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:54.528 17:25:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:54.786 [2024-07-23 17:25:49.995795] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:31:54.786 [2024-07-23 17:25:49.995858] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid62812 ] 00:31:54.786 I/O size of 3145728 is greater than zero copy threshold (65536). 00:31:54.786 Zero copy mechanism will not be used. 
00:31:54.786 [2024-07-23 17:25:50.127196] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:54.786 [2024-07-23 17:25:50.176693] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:55.045 [2024-07-23 17:25:50.242483] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:55.045 [2024-07-23 17:25:50.242520] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:31:55.980 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:55.980 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:31:55.980 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:31:55.980 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:31:56.547 BaseBdev1_malloc 00:31:56.547 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:31:56.547 [2024-07-23 17:25:51.948350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:31:56.547 [2024-07-23 17:25:51.948395] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:56.547 [2024-07-23 17:25:51.948420] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21c2170 00:31:56.547 [2024-07-23 17:25:51.948432] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:56.547 [2024-07-23 17:25:51.950053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:56.547 [2024-07-23 17:25:51.950082] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:56.547 
BaseBdev1 00:31:56.547 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:31:56.547 17:25:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:31:56.806 BaseBdev2_malloc 00:31:56.806 17:25:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:31:57.374 [2024-07-23 17:25:52.691326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:31:57.374 [2024-07-23 17:25:52.691374] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:57.374 [2024-07-23 17:25:52.691395] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20a8680 00:31:57.374 [2024-07-23 17:25:52.691407] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:57.374 [2024-07-23 17:25:52.692981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:57.374 [2024-07-23 17:25:52.693009] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:31:57.374 BaseBdev2 00:31:57.374 17:25:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:31:57.632 spare_malloc 00:31:57.632 17:25:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:31:58.230 spare_delay 00:31:58.230 17:25:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:58.488 [2024-07-23 17:25:53.715767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:58.488 [2024-07-23 17:25:53.715814] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:58.488 [2024-07-23 17:25:53.715833] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ac2a0 00:31:58.488 [2024-07-23 17:25:53.715845] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:58.488 [2024-07-23 17:25:53.717399] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:58.488 [2024-07-23 17:25:53.717427] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:58.488 spare 00:31:58.488 17:25:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:31:59.056 [2024-07-23 17:25:54.217110] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:59.056 [2024-07-23 17:25:54.218446] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:59.056 [2024-07-23 17:25:54.218609] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20abea0 00:31:59.056 [2024-07-23 17:25:54.218623] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:31:59.056 [2024-07-23 17:25:54.218821] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ae010 00:31:59.056 [2024-07-23 17:25:54.218974] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20abea0 00:31:59.056 [2024-07-23 17:25:54.218985] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x20abea0 00:31:59.056 [2024-07-23 17:25:54.219090] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:59.056 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:59.315 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:59.315 "name": "raid_bdev1", 00:31:59.315 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:31:59.315 "strip_size_kb": 0, 00:31:59.315 "state": "online", 00:31:59.315 "raid_level": "raid1", 00:31:59.315 "superblock": true, 00:31:59.315 "num_base_bdevs": 2, 00:31:59.315 
"num_base_bdevs_discovered": 2, 00:31:59.315 "num_base_bdevs_operational": 2, 00:31:59.315 "base_bdevs_list": [ 00:31:59.315 { 00:31:59.315 "name": "BaseBdev1", 00:31:59.315 "uuid": "18e3390c-1736-56fd-b115-d0f095be9152", 00:31:59.315 "is_configured": true, 00:31:59.315 "data_offset": 256, 00:31:59.315 "data_size": 7936 00:31:59.315 }, 00:31:59.315 { 00:31:59.315 "name": "BaseBdev2", 00:31:59.315 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:31:59.315 "is_configured": true, 00:31:59.315 "data_offset": 256, 00:31:59.315 "data_size": 7936 00:31:59.315 } 00:31:59.315 ] 00:31:59.315 }' 00:31:59.315 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:59.315 17:25:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:31:59.882 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:31:59.882 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:32:00.141 [2024-07-23 17:25:55.328345] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:00.141 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:32:00.141 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:00.141 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:32:00.400 
17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:32:00.400 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:32:00.659 [2024-07-23 17:25:55.829454] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200fce0 00:32:00.659 /dev/nbd0 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:00.659 17:25:55 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:00.659 1+0 records in 00:32:00.659 1+0 records out 00:32:00.659 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274283 s, 14.9 MB/s 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:32:00.659 17:25:55 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:32:01.594 7936+0 records in 00:32:01.594 7936+0 records out 00:32:01.594 32505856 bytes (33 MB, 31 MiB) copied, 0.755845 s, 43.0 MB/s 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:32:01.594 [2024-07-23 17:25:56.848999] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:32:01.594 17:25:56 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:32:01.595 17:25:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:32:01.853 [2024-07-23 17:25:57.089691] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:01.853 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:02.112 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:02.112 "name": "raid_bdev1", 00:32:02.112 "uuid": 
"31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:02.112 "strip_size_kb": 0, 00:32:02.112 "state": "online", 00:32:02.112 "raid_level": "raid1", 00:32:02.112 "superblock": true, 00:32:02.112 "num_base_bdevs": 2, 00:32:02.112 "num_base_bdevs_discovered": 1, 00:32:02.112 "num_base_bdevs_operational": 1, 00:32:02.112 "base_bdevs_list": [ 00:32:02.112 { 00:32:02.112 "name": null, 00:32:02.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:02.112 "is_configured": false, 00:32:02.112 "data_offset": 256, 00:32:02.112 "data_size": 7936 00:32:02.112 }, 00:32:02.112 { 00:32:02.112 "name": "BaseBdev2", 00:32:02.112 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:02.112 "is_configured": true, 00:32:02.112 "data_offset": 256, 00:32:02.112 "data_size": 7936 00:32:02.112 } 00:32:02.112 ] 00:32:02.112 }' 00:32:02.112 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:02.112 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:02.677 17:25:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:32:02.935 [2024-07-23 17:25:58.180596] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:32:02.935 [2024-07-23 17:25:58.185494] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20acbe0 00:32:02.935 [2024-07-23 17:25:58.187772] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:32:02.935 17:25:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:32:03.871 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:03.871 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:03.871 17:25:59 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:03.871 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:03.871 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:03.871 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:03.871 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:04.130 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:04.130 "name": "raid_bdev1", 00:32:04.130 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:04.130 "strip_size_kb": 0, 00:32:04.130 "state": "online", 00:32:04.130 "raid_level": "raid1", 00:32:04.130 "superblock": true, 00:32:04.130 "num_base_bdevs": 2, 00:32:04.130 "num_base_bdevs_discovered": 2, 00:32:04.130 "num_base_bdevs_operational": 2, 00:32:04.130 "process": { 00:32:04.130 "type": "rebuild", 00:32:04.130 "target": "spare", 00:32:04.130 "progress": { 00:32:04.130 "blocks": 3072, 00:32:04.130 "percent": 38 00:32:04.130 } 00:32:04.130 }, 00:32:04.130 "base_bdevs_list": [ 00:32:04.130 { 00:32:04.130 "name": "spare", 00:32:04.130 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:04.130 "is_configured": true, 00:32:04.130 "data_offset": 256, 00:32:04.130 "data_size": 7936 00:32:04.130 }, 00:32:04.130 { 00:32:04.130 "name": "BaseBdev2", 00:32:04.130 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:04.130 "is_configured": true, 00:32:04.130 "data_offset": 256, 00:32:04.130 "data_size": 7936 00:32:04.130 } 00:32:04.130 ] 00:32:04.130 }' 00:32:04.130 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:04.389 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:32:04.389 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:04.389 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:32:04.389 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:32:04.648 [2024-07-23 17:25:59.826150] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:04.648 [2024-07-23 17:25:59.901094] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:32:04.648 [2024-07-23 17:25:59.901135] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:04.648 [2024-07-23 17:25:59.901156] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:04.648 [2024-07-23 17:25:59.901165] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:04.648 17:25:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:04.907 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:04.907 "name": "raid_bdev1", 00:32:04.907 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:04.907 "strip_size_kb": 0, 00:32:04.907 "state": "online", 00:32:04.907 "raid_level": "raid1", 00:32:04.907 "superblock": true, 00:32:04.907 "num_base_bdevs": 2, 00:32:04.907 "num_base_bdevs_discovered": 1, 00:32:04.907 "num_base_bdevs_operational": 1, 00:32:04.907 "base_bdevs_list": [ 00:32:04.907 { 00:32:04.907 "name": null, 00:32:04.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:04.907 "is_configured": false, 00:32:04.907 "data_offset": 256, 00:32:04.907 "data_size": 7936 00:32:04.907 }, 00:32:04.907 { 00:32:04.907 "name": "BaseBdev2", 00:32:04.907 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:04.907 "is_configured": true, 00:32:04.907 "data_offset": 256, 00:32:04.907 "data_size": 7936 00:32:04.907 } 00:32:04.907 ] 00:32:04.907 }' 00:32:04.907 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:04.907 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:05.475 17:26:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:05.733 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:05.733 "name": "raid_bdev1", 00:32:05.733 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:05.733 "strip_size_kb": 0, 00:32:05.733 "state": "online", 00:32:05.733 "raid_level": "raid1", 00:32:05.733 "superblock": true, 00:32:05.733 "num_base_bdevs": 2, 00:32:05.733 "num_base_bdevs_discovered": 1, 00:32:05.733 "num_base_bdevs_operational": 1, 00:32:05.733 "base_bdevs_list": [ 00:32:05.733 { 00:32:05.733 "name": null, 00:32:05.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:05.733 "is_configured": false, 00:32:05.733 "data_offset": 256, 00:32:05.733 "data_size": 7936 00:32:05.733 }, 00:32:05.733 { 00:32:05.733 "name": "BaseBdev2", 00:32:05.733 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:05.733 "is_configured": true, 00:32:05.733 "data_offset": 256, 00:32:05.733 "data_size": 7936 00:32:05.733 } 00:32:05.733 ] 00:32:05.733 }' 00:32:05.733 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:05.733 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:32:05.733 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:32:05.733 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:32:05.733 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:32:05.992 [2024-07-23 17:26:01.370055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:32:05.992 [2024-07-23 17:26:01.374945] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200fce0 00:32:05.992 [2024-07-23 17:26:01.376377] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:32:05.992 17:26:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:07.368 "name": "raid_bdev1", 00:32:07.368 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:07.368 "strip_size_kb": 0, 00:32:07.368 "state": "online", 00:32:07.368 
"raid_level": "raid1", 00:32:07.368 "superblock": true, 00:32:07.368 "num_base_bdevs": 2, 00:32:07.368 "num_base_bdevs_discovered": 2, 00:32:07.368 "num_base_bdevs_operational": 2, 00:32:07.368 "process": { 00:32:07.368 "type": "rebuild", 00:32:07.368 "target": "spare", 00:32:07.368 "progress": { 00:32:07.368 "blocks": 3072, 00:32:07.368 "percent": 38 00:32:07.368 } 00:32:07.368 }, 00:32:07.368 "base_bdevs_list": [ 00:32:07.368 { 00:32:07.368 "name": "spare", 00:32:07.368 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:07.368 "is_configured": true, 00:32:07.368 "data_offset": 256, 00:32:07.368 "data_size": 7936 00:32:07.368 }, 00:32:07.368 { 00:32:07.368 "name": "BaseBdev2", 00:32:07.368 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:07.368 "is_configured": true, 00:32:07.368 "data_offset": 256, 00:32:07.368 "data_size": 7936 00:32:07.368 } 00:32:07.368 ] 00:32:07.368 }' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:32:07.368 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1062 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:07.368 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:07.626 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:07.627 "name": "raid_bdev1", 00:32:07.627 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:07.627 "strip_size_kb": 0, 00:32:07.627 "state": "online", 00:32:07.627 "raid_level": "raid1", 00:32:07.627 "superblock": true, 00:32:07.627 "num_base_bdevs": 2, 00:32:07.627 "num_base_bdevs_discovered": 2, 00:32:07.627 "num_base_bdevs_operational": 2, 00:32:07.627 "process": { 00:32:07.627 "type": "rebuild", 00:32:07.627 "target": "spare", 00:32:07.627 "progress": { 00:32:07.627 "blocks": 3840, 00:32:07.627 "percent": 48 00:32:07.627 } 00:32:07.627 }, 00:32:07.627 "base_bdevs_list": [ 00:32:07.627 { 00:32:07.627 "name": "spare", 00:32:07.627 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:07.627 "is_configured": 
true, 00:32:07.627 "data_offset": 256, 00:32:07.627 "data_size": 7936 00:32:07.627 }, 00:32:07.627 { 00:32:07.627 "name": "BaseBdev2", 00:32:07.627 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:07.627 "is_configured": true, 00:32:07.627 "data_offset": 256, 00:32:07.627 "data_size": 7936 00:32:07.627 } 00:32:07.627 ] 00:32:07.627 }' 00:32:07.627 17:26:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:07.627 17:26:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:32:07.627 17:26:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:07.885 17:26:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:32:07.885 17:26:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:08.819 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:09.078 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:09.078 "name": "raid_bdev1", 00:32:09.078 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:09.078 "strip_size_kb": 0, 00:32:09.078 "state": "online", 00:32:09.078 "raid_level": "raid1", 00:32:09.078 "superblock": true, 00:32:09.078 "num_base_bdevs": 2, 00:32:09.078 "num_base_bdevs_discovered": 2, 00:32:09.078 "num_base_bdevs_operational": 2, 00:32:09.078 "process": { 00:32:09.078 "type": "rebuild", 00:32:09.078 "target": "spare", 00:32:09.078 "progress": { 00:32:09.078 "blocks": 7424, 00:32:09.078 "percent": 93 00:32:09.078 } 00:32:09.078 }, 00:32:09.078 "base_bdevs_list": [ 00:32:09.078 { 00:32:09.078 "name": "spare", 00:32:09.078 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:09.078 "is_configured": true, 00:32:09.078 "data_offset": 256, 00:32:09.078 "data_size": 7936 00:32:09.078 }, 00:32:09.078 { 00:32:09.078 "name": "BaseBdev2", 00:32:09.078 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:09.078 "is_configured": true, 00:32:09.078 "data_offset": 256, 00:32:09.078 "data_size": 7936 00:32:09.078 } 00:32:09.078 ] 00:32:09.078 }' 00:32:09.078 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:09.078 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:32:09.078 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:09.078 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:32:09.078 17:26:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:32:09.336 [2024-07-23 17:26:04.500544] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:32:09.336 [2024-07-23 17:26:04.500600] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:32:09.336 [2024-07-23 17:26:04.500679] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:10.272 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:10.634 "name": "raid_bdev1", 00:32:10.634 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:10.634 "strip_size_kb": 0, 00:32:10.634 "state": "online", 00:32:10.634 "raid_level": "raid1", 00:32:10.634 "superblock": true, 00:32:10.634 "num_base_bdevs": 2, 00:32:10.634 "num_base_bdevs_discovered": 2, 00:32:10.634 "num_base_bdevs_operational": 2, 00:32:10.634 "base_bdevs_list": [ 00:32:10.634 { 00:32:10.634 "name": "spare", 00:32:10.634 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:10.634 "is_configured": true, 00:32:10.634 "data_offset": 256, 00:32:10.634 "data_size": 7936 00:32:10.634 }, 00:32:10.634 { 00:32:10.634 "name": "BaseBdev2", 00:32:10.634 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:10.634 "is_configured": true, 00:32:10.634 "data_offset": 256, 00:32:10.634 
"data_size": 7936 00:32:10.634 } 00:32:10.634 ] 00:32:10.634 }' 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:10.634 17:26:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:10.634 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:10.634 "name": "raid_bdev1", 00:32:10.634 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:10.634 "strip_size_kb": 0, 00:32:10.634 "state": "online", 00:32:10.634 "raid_level": "raid1", 00:32:10.634 "superblock": true, 00:32:10.634 "num_base_bdevs": 2, 00:32:10.634 "num_base_bdevs_discovered": 2, 00:32:10.634 "num_base_bdevs_operational": 2, 00:32:10.634 
"base_bdevs_list": [ 00:32:10.634 { 00:32:10.634 "name": "spare", 00:32:10.634 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:10.634 "is_configured": true, 00:32:10.634 "data_offset": 256, 00:32:10.634 "data_size": 7936 00:32:10.634 }, 00:32:10.634 { 00:32:10.634 "name": "BaseBdev2", 00:32:10.634 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:10.634 "is_configured": true, 00:32:10.634 "data_offset": 256, 00:32:10.634 "data_size": 7936 00:32:10.634 } 00:32:10.634 ] 00:32:10.634 }' 00:32:10.634 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:10.894 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:11.153 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:11.153 "name": "raid_bdev1", 00:32:11.153 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:11.153 "strip_size_kb": 0, 00:32:11.153 "state": "online", 00:32:11.153 "raid_level": "raid1", 00:32:11.153 "superblock": true, 00:32:11.153 "num_base_bdevs": 2, 00:32:11.153 "num_base_bdevs_discovered": 2, 00:32:11.153 "num_base_bdevs_operational": 2, 00:32:11.153 "base_bdevs_list": [ 00:32:11.153 { 00:32:11.153 "name": "spare", 00:32:11.153 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:11.153 "is_configured": true, 00:32:11.153 "data_offset": 256, 00:32:11.153 "data_size": 7936 00:32:11.153 }, 00:32:11.153 { 00:32:11.153 "name": "BaseBdev2", 00:32:11.153 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:11.153 "is_configured": true, 00:32:11.153 "data_offset": 256, 00:32:11.153 "data_size": 7936 00:32:11.153 } 00:32:11.153 ] 00:32:11.153 }' 00:32:11.153 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:11.153 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:11.720 17:26:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:11.978 [2024-07-23 17:26:07.192951] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:11.978 [2024-07-23 17:26:07.192978] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:32:11.978 [2024-07-23 17:26:07.193033] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:11.978 [2024-07-23 17:26:07.193088] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:11.978 [2024-07-23 17:26:07.193100] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20abea0 name raid_bdev1, state offline 00:32:11.978 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:11.978 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:32:12.349 /dev/nbd0 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:12.349 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:12.350 1+0 records in 00:32:12.350 1+0 records out 00:32:12.350 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247441 s, 16.6 MB/s 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:12.350 17:26:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:32:12.667 /dev/nbd1 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:32:12.667 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:12.668 1+0 records in 00:32:12.668 1+0 records out 00:32:12.668 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311159 s, 13.2 MB/s 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:12.668 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:32:12.926 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:32:12.926 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:32:12.926 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:12.926 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:12.926 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:32:12.926 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:12.926 17:26:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:13.185 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:13.444 17:26:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:32:13.444 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:32:13.702 17:26:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:32:13.961 [2024-07-23 17:26:09.153559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:32:13.961 [2024-07-23 17:26:09.153603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:13.961 [2024-07-23 17:26:09.153623] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20acb80 00:32:13.961 [2024-07-23 17:26:09.153635] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:13.961 [2024-07-23 17:26:09.155232] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:13.961 [2024-07-23 17:26:09.155262] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:32:13.961 [2024-07-23 17:26:09.155339] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:32:13.961 [2024-07-23 17:26:09.155363] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:32:13.961 [2024-07-23 17:26:09.155459] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:13.961 spare 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:13.961 17:26:09 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:13.961 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:13.961 [2024-07-23 17:26:09.255773] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x200ece0 00:32:13.961 [2024-07-23 17:26:09.255792] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:13.961 [2024-07-23 17:26:09.255992] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200fbb0 00:32:13.961 [2024-07-23 17:26:09.256144] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x200ece0 00:32:13.961 [2024-07-23 17:26:09.256154] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x200ece0 00:32:13.961 [2024-07-23 17:26:09.256261] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:14.220 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:14.220 "name": "raid_bdev1", 00:32:14.220 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:14.220 "strip_size_kb": 0, 00:32:14.220 "state": "online", 00:32:14.220 "raid_level": "raid1", 00:32:14.220 "superblock": true, 00:32:14.220 "num_base_bdevs": 2, 00:32:14.220 "num_base_bdevs_discovered": 2, 00:32:14.220 "num_base_bdevs_operational": 2, 00:32:14.220 "base_bdevs_list": [ 00:32:14.220 { 00:32:14.220 "name": "spare", 00:32:14.220 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:14.220 "is_configured": true, 00:32:14.220 "data_offset": 256, 00:32:14.220 "data_size": 7936 00:32:14.220 }, 00:32:14.220 { 00:32:14.220 "name": "BaseBdev2", 00:32:14.220 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:14.220 "is_configured": true, 00:32:14.220 "data_offset": 256, 00:32:14.220 "data_size": 7936 00:32:14.220 } 00:32:14.220 ] 00:32:14.220 }' 00:32:14.220 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:14.220 17:26:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:32:14.787 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:15.046 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:15.046 "name": "raid_bdev1", 00:32:15.046 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:15.046 "strip_size_kb": 0, 00:32:15.046 "state": "online", 00:32:15.046 "raid_level": "raid1", 00:32:15.046 "superblock": true, 00:32:15.046 "num_base_bdevs": 2, 00:32:15.046 "num_base_bdevs_discovered": 2, 00:32:15.046 "num_base_bdevs_operational": 2, 00:32:15.046 "base_bdevs_list": [ 00:32:15.046 { 00:32:15.046 "name": "spare", 00:32:15.046 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:15.046 "is_configured": true, 00:32:15.046 "data_offset": 256, 00:32:15.046 "data_size": 7936 00:32:15.046 }, 00:32:15.046 { 00:32:15.046 "name": "BaseBdev2", 00:32:15.046 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:15.046 "is_configured": true, 00:32:15.046 "data_offset": 256, 00:32:15.046 "data_size": 7936 00:32:15.046 } 00:32:15.046 ] 00:32:15.046 }' 00:32:15.046 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:15.046 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:32:15.046 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:15.046 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:32:15.047 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:15.047 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:32:15.305 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare 
== \s\p\a\r\e ]] 00:32:15.305 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:32:15.564 [2024-07-23 17:26:10.894308] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:15.564 17:26:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:15.823 17:26:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:15.823 "name": "raid_bdev1", 00:32:15.823 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 
00:32:15.823 "strip_size_kb": 0, 00:32:15.823 "state": "online", 00:32:15.823 "raid_level": "raid1", 00:32:15.823 "superblock": true, 00:32:15.823 "num_base_bdevs": 2, 00:32:15.823 "num_base_bdevs_discovered": 1, 00:32:15.823 "num_base_bdevs_operational": 1, 00:32:15.823 "base_bdevs_list": [ 00:32:15.823 { 00:32:15.823 "name": null, 00:32:15.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:15.823 "is_configured": false, 00:32:15.823 "data_offset": 256, 00:32:15.823 "data_size": 7936 00:32:15.823 }, 00:32:15.823 { 00:32:15.823 "name": "BaseBdev2", 00:32:15.823 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:15.823 "is_configured": true, 00:32:15.823 "data_offset": 256, 00:32:15.823 "data_size": 7936 00:32:15.823 } 00:32:15.823 ] 00:32:15.823 }' 00:32:15.823 17:26:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:15.823 17:26:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:16.391 17:26:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:32:16.650 [2024-07-23 17:26:11.977201] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:32:16.650 [2024-07-23 17:26:11.977355] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:32:16.650 [2024-07-23 17:26:11.977371] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:32:16.650 [2024-07-23 17:26:11.977398] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:32:16.650 [2024-07-23 17:26:11.982802] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21c1a00 00:32:16.650 [2024-07-23 17:26:11.984170] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:32:16.650 17:26:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:32:17.587 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:17.587 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:17.587 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:17.587 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:17.587 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:17.846 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:17.846 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:17.846 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:17.846 "name": "raid_bdev1", 00:32:17.846 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:17.846 "strip_size_kb": 0, 00:32:17.846 "state": "online", 00:32:17.846 "raid_level": "raid1", 00:32:17.846 "superblock": true, 00:32:17.846 "num_base_bdevs": 2, 00:32:17.846 "num_base_bdevs_discovered": 2, 00:32:17.846 "num_base_bdevs_operational": 2, 00:32:17.846 "process": { 00:32:17.846 "type": "rebuild", 00:32:17.846 "target": "spare", 00:32:17.846 "progress": { 00:32:17.846 "blocks": 3072, 
00:32:17.846 "percent": 38 00:32:17.846 } 00:32:17.846 }, 00:32:17.846 "base_bdevs_list": [ 00:32:17.846 { 00:32:17.846 "name": "spare", 00:32:17.846 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:17.846 "is_configured": true, 00:32:17.846 "data_offset": 256, 00:32:17.846 "data_size": 7936 00:32:17.846 }, 00:32:17.846 { 00:32:17.846 "name": "BaseBdev2", 00:32:17.846 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:17.846 "is_configured": true, 00:32:17.846 "data_offset": 256, 00:32:17.846 "data_size": 7936 00:32:17.846 } 00:32:17.846 ] 00:32:17.846 }' 00:32:17.846 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:18.106 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:32:18.107 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:18.107 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:32:18.107 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:32:18.366 [2024-07-23 17:26:13.571300] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:18.366 [2024-07-23 17:26:13.596757] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:32:18.366 [2024-07-23 17:26:13.596797] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:18.366 [2024-07-23 17:26:13.596812] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:18.366 [2024-07-23 17:26:13.596821] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:18.366 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:18.625 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:18.625 "name": "raid_bdev1", 00:32:18.625 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:18.625 "strip_size_kb": 0, 00:32:18.625 "state": "online", 00:32:18.625 "raid_level": "raid1", 00:32:18.625 "superblock": true, 00:32:18.625 "num_base_bdevs": 2, 00:32:18.625 "num_base_bdevs_discovered": 1, 00:32:18.625 "num_base_bdevs_operational": 1, 00:32:18.625 "base_bdevs_list": [ 00:32:18.625 { 00:32:18.625 "name": null, 00:32:18.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:18.625 "is_configured": false, 00:32:18.625 "data_offset": 
256, 00:32:18.625 "data_size": 7936 00:32:18.625 }, 00:32:18.625 { 00:32:18.625 "name": "BaseBdev2", 00:32:18.625 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:18.625 "is_configured": true, 00:32:18.625 "data_offset": 256, 00:32:18.625 "data_size": 7936 00:32:18.625 } 00:32:18.625 ] 00:32:18.625 }' 00:32:18.625 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:18.625 17:26:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:19.192 17:26:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:32:19.451 [2024-07-23 17:26:14.704858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:32:19.451 [2024-07-23 17:26:14.704917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:19.451 [2024-07-23 17:26:14.704938] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20adad0 00:32:19.451 [2024-07-23 17:26:14.704951] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:19.451 [2024-07-23 17:26:14.705315] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:19.451 [2024-07-23 17:26:14.705332] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:32:19.451 [2024-07-23 17:26:14.705410] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:32:19.451 [2024-07-23 17:26:14.705421] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:32:19.451 [2024-07-23 17:26:14.705432] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:32:19.451 [2024-07-23 17:26:14.705450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:32:19.451 [2024-07-23 17:26:14.710244] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21c08a0 00:32:19.451 spare 00:32:19.451 [2024-07-23 17:26:14.711562] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:32:19.451 17:26:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:20.387 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:20.646 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:20.646 "name": "raid_bdev1", 00:32:20.646 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:20.646 "strip_size_kb": 0, 00:32:20.646 "state": "online", 00:32:20.646 "raid_level": "raid1", 00:32:20.646 "superblock": true, 00:32:20.646 "num_base_bdevs": 2, 00:32:20.646 "num_base_bdevs_discovered": 2, 00:32:20.646 "num_base_bdevs_operational": 2, 00:32:20.646 "process": { 00:32:20.646 "type": "rebuild", 00:32:20.646 "target": "spare", 00:32:20.646 "progress": { 00:32:20.646 
"blocks": 3072, 00:32:20.646 "percent": 38 00:32:20.646 } 00:32:20.646 }, 00:32:20.646 "base_bdevs_list": [ 00:32:20.646 { 00:32:20.646 "name": "spare", 00:32:20.646 "uuid": "df685c05-9114-5413-89b8-8c5b7330aed5", 00:32:20.646 "is_configured": true, 00:32:20.646 "data_offset": 256, 00:32:20.646 "data_size": 7936 00:32:20.646 }, 00:32:20.646 { 00:32:20.646 "name": "BaseBdev2", 00:32:20.646 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:20.646 "is_configured": true, 00:32:20.646 "data_offset": 256, 00:32:20.646 "data_size": 7936 00:32:20.646 } 00:32:20.646 ] 00:32:20.646 }' 00:32:20.646 17:26:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:20.646 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:32:20.646 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:20.905 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:32:20.905 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:32:20.905 [2024-07-23 17:26:16.299256] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:20.905 [2024-07-23 17:26:16.324275] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:32:20.905 [2024-07-23 17:26:16.324317] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:20.905 [2024-07-23 17:26:16.324332] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:32:20.905 [2024-07-23 17:26:16.324341] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:21.164 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:21.423 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:21.423 "name": "raid_bdev1", 00:32:21.423 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:21.423 "strip_size_kb": 0, 00:32:21.423 "state": "online", 00:32:21.423 "raid_level": "raid1", 00:32:21.423 "superblock": true, 00:32:21.423 "num_base_bdevs": 2, 00:32:21.423 "num_base_bdevs_discovered": 1, 00:32:21.423 "num_base_bdevs_operational": 1, 00:32:21.423 "base_bdevs_list": [ 00:32:21.423 { 00:32:21.423 "name": null, 00:32:21.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:21.423 "is_configured": false, 00:32:21.423 
"data_offset": 256, 00:32:21.423 "data_size": 7936 00:32:21.423 }, 00:32:21.423 { 00:32:21.423 "name": "BaseBdev2", 00:32:21.423 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:21.423 "is_configured": true, 00:32:21.423 "data_offset": 256, 00:32:21.423 "data_size": 7936 00:32:21.423 } 00:32:21.423 ] 00:32:21.423 }' 00:32:21.423 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:21.423 17:26:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:21.990 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:22.248 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:22.248 "name": "raid_bdev1", 00:32:22.248 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:22.248 "strip_size_kb": 0, 00:32:22.248 "state": "online", 00:32:22.248 "raid_level": "raid1", 00:32:22.248 "superblock": true, 00:32:22.248 "num_base_bdevs": 2, 00:32:22.248 "num_base_bdevs_discovered": 1, 00:32:22.248 "num_base_bdevs_operational": 1, 00:32:22.248 "base_bdevs_list": [ 00:32:22.248 { 00:32:22.248 "name": null, 00:32:22.248 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:32:22.248 "is_configured": false, 00:32:22.248 "data_offset": 256, 00:32:22.248 "data_size": 7936 00:32:22.248 }, 00:32:22.248 { 00:32:22.248 "name": "BaseBdev2", 00:32:22.248 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:22.248 "is_configured": true, 00:32:22.248 "data_offset": 256, 00:32:22.248 "data_size": 7936 00:32:22.248 } 00:32:22.248 ] 00:32:22.248 }' 00:32:22.248 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:22.248 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:32:22.248 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:22.249 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:32:22.249 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:32:22.507 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:32:22.765 [2024-07-23 17:26:17.977832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:32:22.765 [2024-07-23 17:26:17.977879] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:22.765 [2024-07-23 17:26:17.977904] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21be020 00:32:22.765 [2024-07-23 17:26:17.977918] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:22.765 [2024-07-23 17:26:17.978258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:22.765 [2024-07-23 17:26:17.978275] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:32:22.765 [2024-07-23 17:26:17.978337] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:32:22.765 [2024-07-23 17:26:17.978349] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:32:22.766 [2024-07-23 17:26:17.978359] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:32:22.766 BaseBdev1 00:32:22.766 17:26:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:32:23.698 17:26:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:23.698 17:26:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:23.698 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:23.698 17:26:19 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:23.956 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:23.957 "name": "raid_bdev1", 00:32:23.957 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:23.957 "strip_size_kb": 0, 00:32:23.957 "state": "online", 00:32:23.957 "raid_level": "raid1", 00:32:23.957 "superblock": true, 00:32:23.957 "num_base_bdevs": 2, 00:32:23.957 "num_base_bdevs_discovered": 1, 00:32:23.957 "num_base_bdevs_operational": 1, 00:32:23.957 "base_bdevs_list": [ 00:32:23.957 { 00:32:23.957 "name": null, 00:32:23.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:23.957 "is_configured": false, 00:32:23.957 "data_offset": 256, 00:32:23.957 "data_size": 7936 00:32:23.957 }, 00:32:23.957 { 00:32:23.957 "name": "BaseBdev2", 00:32:23.957 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:23.957 "is_configured": true, 00:32:23.957 "data_offset": 256, 00:32:23.957 "data_size": 7936 00:32:23.957 } 00:32:23.957 ] 00:32:23.957 }' 00:32:23.957 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:23.957 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:24.524 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:32:24.524 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:24.524 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:32:24.524 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:32:24.524 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:24.524 17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:24.524 
17:26:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:24.783 "name": "raid_bdev1", 00:32:24.783 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:24.783 "strip_size_kb": 0, 00:32:24.783 "state": "online", 00:32:24.783 "raid_level": "raid1", 00:32:24.783 "superblock": true, 00:32:24.783 "num_base_bdevs": 2, 00:32:24.783 "num_base_bdevs_discovered": 1, 00:32:24.783 "num_base_bdevs_operational": 1, 00:32:24.783 "base_bdevs_list": [ 00:32:24.783 { 00:32:24.783 "name": null, 00:32:24.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:24.783 "is_configured": false, 00:32:24.783 "data_offset": 256, 00:32:24.783 "data_size": 7936 00:32:24.783 }, 00:32:24.783 { 00:32:24.783 "name": "BaseBdev2", 00:32:24.783 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:24.783 "is_configured": true, 00:32:24.783 "data_offset": 256, 00:32:24.783 "data_size": 7936 00:32:24.783 } 00:32:24.783 ] 00:32:24.783 }' 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k 
-- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:32:24.783 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:32:25.350 [2024-07-23 17:26:20.592783] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:25.350 [2024-07-23 17:26:20.592909] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:32:25.350 [2024-07-23 17:26:20.592924] bdev_raid.c:3673:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:32:25.350 request: 00:32:25.350 { 00:32:25.350 "base_bdev": "BaseBdev1", 00:32:25.350 "raid_bdev": "raid_bdev1", 00:32:25.350 "method": "bdev_raid_add_base_bdev", 00:32:25.350 "req_id": 1 00:32:25.350 } 00:32:25.350 Got JSON-RPC error response 00:32:25.350 response: 00:32:25.350 { 00:32:25.350 "code": -22, 00:32:25.350 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:32:25.350 } 00:32:25.350 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:32:25.350 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:32:25.350 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:32:25.350 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:32:25.350 17:26:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:26.286 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:26.545 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:26.545 "name": "raid_bdev1", 00:32:26.545 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:26.545 "strip_size_kb": 0, 00:32:26.545 "state": "online", 00:32:26.545 "raid_level": "raid1", 00:32:26.545 "superblock": true, 00:32:26.545 "num_base_bdevs": 2, 00:32:26.545 "num_base_bdevs_discovered": 1, 00:32:26.545 "num_base_bdevs_operational": 1, 00:32:26.545 "base_bdevs_list": [ 00:32:26.545 { 00:32:26.545 "name": null, 00:32:26.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:26.545 "is_configured": false, 00:32:26.545 "data_offset": 256, 00:32:26.545 "data_size": 7936 00:32:26.546 }, 00:32:26.546 { 00:32:26.546 "name": "BaseBdev2", 00:32:26.546 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:26.546 "is_configured": true, 00:32:26.546 "data_offset": 256, 00:32:26.546 "data_size": 7936 00:32:26.546 } 00:32:26.546 ] 00:32:26.546 }' 00:32:26.546 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:26.546 17:26:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:27.113 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:32:27.113 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:32:27.113 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:32:27.113 
17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:32:27.113 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:32:27.113 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:27.113 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:27.372 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:32:27.372 "name": "raid_bdev1", 00:32:27.372 "uuid": "31c35429-9ee1-41f4-b69e-751f585ccfe1", 00:32:27.372 "strip_size_kb": 0, 00:32:27.372 "state": "online", 00:32:27.372 "raid_level": "raid1", 00:32:27.372 "superblock": true, 00:32:27.372 "num_base_bdevs": 2, 00:32:27.372 "num_base_bdevs_discovered": 1, 00:32:27.372 "num_base_bdevs_operational": 1, 00:32:27.372 "base_bdevs_list": [ 00:32:27.372 { 00:32:27.372 "name": null, 00:32:27.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:27.372 "is_configured": false, 00:32:27.372 "data_offset": 256, 00:32:27.372 "data_size": 7936 00:32:27.372 }, 00:32:27.372 { 00:32:27.372 "name": "BaseBdev2", 00:32:27.372 "uuid": "af79de54-db1b-5970-a3a2-c31acf858121", 00:32:27.372 "is_configured": true, 00:32:27.372 "data_offset": 256, 00:32:27.372 "data_size": 7936 00:32:27.372 } 00:32:27.372 ] 00:32:27.372 }' 00:32:27.372 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:32:27.372 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:32:27.372 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:32:27.631 17:26:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 62812 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 62812 ']' 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 62812 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 62812 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 62812' 00:32:27.631 killing process with pid 62812 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 62812 00:32:27.631 Received shutdown signal, test time was about 60.000000 seconds 00:32:27.631 00:32:27.631 Latency(us) 00:32:27.631 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:27.631 =================================================================================================================== 00:32:27.631 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:32:27.631 [2024-07-23 17:26:22.864391] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:27.631 [2024-07-23 17:26:22.864476] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:27.631 [2024-07-23 17:26:22.864518] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:27.631 [2024-07-23 17:26:22.864529] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x200ece0 name raid_bdev1, state offline 00:32:27.631 17:26:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 62812 00:32:27.631 [2024-07-23 17:26:22.892004] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:27.891 17:26:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:32:27.891 00:32:27.891 real 0m33.167s 00:32:27.891 user 0m52.170s 00:32:27.891 sys 0m5.437s 00:32:27.891 17:26:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:27.891 17:26:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:32:27.891 ************************************ 00:32:27.891 END TEST raid_rebuild_test_sb_4k 00:32:27.891 ************************************ 00:32:27.891 17:26:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:32:27.891 17:26:23 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:32:27.891 17:26:23 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:32:27.891 17:26:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:27.891 17:26:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:27.891 17:26:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:27.891 ************************************ 00:32:27.891 START TEST raid_state_function_test_sb_md_separate 00:32:27.891 ************************************ 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:32:27.891 17:26:23 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:32:27.891 17:26:23 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=67502 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 67502' 00:32:27.891 Process raid pid: 67502 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 67502 /var/tmp/spdk-raid.sock 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 67502 ']' 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:27.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:27.891 17:26:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:27.891 [2024-07-23 17:26:23.262182] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:32:27.891 [2024-07-23 17:26:23.262254] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:28.149 [2024-07-23 17:26:23.396990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:28.149 [2024-07-23 17:26:23.450812] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:28.149 [2024-07-23 17:26:23.513995] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:28.149 [2024-07-23 17:26:23.514033] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:32:29.082 [2024-07-23 17:26:24.425280] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:29.082 [2024-07-23 17:26:24.425327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:29.082 [2024-07-23 17:26:24.425338] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:29.082 
[2024-07-23 17:26:24.425350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:29.082 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:29.340 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:29.340 "name": "Existed_Raid", 00:32:29.340 "uuid": 
"cfafd195-ff89-4b5f-bb0f-c9cffe1600d5", 00:32:29.340 "strip_size_kb": 0, 00:32:29.340 "state": "configuring", 00:32:29.340 "raid_level": "raid1", 00:32:29.340 "superblock": true, 00:32:29.340 "num_base_bdevs": 2, 00:32:29.340 "num_base_bdevs_discovered": 0, 00:32:29.340 "num_base_bdevs_operational": 2, 00:32:29.340 "base_bdevs_list": [ 00:32:29.340 { 00:32:29.340 "name": "BaseBdev1", 00:32:29.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:29.340 "is_configured": false, 00:32:29.340 "data_offset": 0, 00:32:29.340 "data_size": 0 00:32:29.340 }, 00:32:29.340 { 00:32:29.340 "name": "BaseBdev2", 00:32:29.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:29.340 "is_configured": false, 00:32:29.340 "data_offset": 0, 00:32:29.340 "data_size": 0 00:32:29.340 } 00:32:29.340 ] 00:32:29.340 }' 00:32:29.340 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:29.340 17:26:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:29.908 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:30.166 [2024-07-23 17:26:25.524035] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:30.166 [2024-07-23 17:26:25.524068] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d2d3f0 name Existed_Raid, state configuring 00:32:30.166 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:32:30.424 [2024-07-23 17:26:25.708542] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:32:30.424 [2024-07-23 17:26:25.708573] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:32:30.424 [2024-07-23 17:26:25.708583] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:30.424 [2024-07-23 17:26:25.708601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:30.424 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:32:30.683 [2024-07-23 17:26:25.971759] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:30.683 BaseBdev1 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:30.683 17:26:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:32:30.942 [ 00:32:30.942 { 00:32:30.942 "name": 
"BaseBdev1", 00:32:30.942 "aliases": [ 00:32:30.942 "19c9f5a8-6061-4210-8917-8f71db8ae9ba" 00:32:30.942 ], 00:32:30.942 "product_name": "Malloc disk", 00:32:30.942 "block_size": 4096, 00:32:30.942 "num_blocks": 8192, 00:32:30.942 "uuid": "19c9f5a8-6061-4210-8917-8f71db8ae9ba", 00:32:30.942 "md_size": 32, 00:32:30.942 "md_interleave": false, 00:32:30.942 "dif_type": 0, 00:32:30.942 "assigned_rate_limits": { 00:32:30.942 "rw_ios_per_sec": 0, 00:32:30.942 "rw_mbytes_per_sec": 0, 00:32:30.942 "r_mbytes_per_sec": 0, 00:32:30.942 "w_mbytes_per_sec": 0 00:32:30.942 }, 00:32:30.942 "claimed": true, 00:32:30.942 "claim_type": "exclusive_write", 00:32:30.942 "zoned": false, 00:32:30.942 "supported_io_types": { 00:32:30.942 "read": true, 00:32:30.942 "write": true, 00:32:30.942 "unmap": true, 00:32:30.942 "flush": true, 00:32:30.942 "reset": true, 00:32:30.942 "nvme_admin": false, 00:32:30.942 "nvme_io": false, 00:32:30.942 "nvme_io_md": false, 00:32:30.942 "write_zeroes": true, 00:32:30.942 "zcopy": true, 00:32:30.942 "get_zone_info": false, 00:32:30.942 "zone_management": false, 00:32:30.942 "zone_append": false, 00:32:30.942 "compare": false, 00:32:30.942 "compare_and_write": false, 00:32:30.942 "abort": true, 00:32:30.942 "seek_hole": false, 00:32:30.942 "seek_data": false, 00:32:30.942 "copy": true, 00:32:30.942 "nvme_iov_md": false 00:32:30.942 }, 00:32:30.942 "memory_domains": [ 00:32:30.942 { 00:32:30.942 "dma_device_id": "system", 00:32:30.942 "dma_device_type": 1 00:32:30.942 }, 00:32:30.942 { 00:32:30.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:30.942 "dma_device_type": 2 00:32:30.942 } 00:32:30.942 ], 00:32:30.942 "driver_specific": {} 00:32:30.942 } 00:32:30.942 ] 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:32:30.942 
17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:30.942 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:31.201 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:31.201 "name": "Existed_Raid", 00:32:31.201 "uuid": "78cc3db7-42ee-42be-a7fc-14f6218e05ba", 00:32:31.201 "strip_size_kb": 0, 00:32:31.201 "state": "configuring", 00:32:31.201 "raid_level": "raid1", 00:32:31.201 "superblock": true, 00:32:31.201 "num_base_bdevs": 2, 00:32:31.201 "num_base_bdevs_discovered": 1, 00:32:31.201 "num_base_bdevs_operational": 2, 00:32:31.201 
"base_bdevs_list": [ 00:32:31.201 { 00:32:31.201 "name": "BaseBdev1", 00:32:31.201 "uuid": "19c9f5a8-6061-4210-8917-8f71db8ae9ba", 00:32:31.201 "is_configured": true, 00:32:31.201 "data_offset": 256, 00:32:31.201 "data_size": 7936 00:32:31.201 }, 00:32:31.201 { 00:32:31.201 "name": "BaseBdev2", 00:32:31.201 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:31.201 "is_configured": false, 00:32:31.201 "data_offset": 0, 00:32:31.201 "data_size": 0 00:32:31.201 } 00:32:31.201 ] 00:32:31.201 }' 00:32:31.201 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:31.201 17:26:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:32.138 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:32:32.138 [2024-07-23 17:26:27.467752] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:32:32.138 [2024-07-23 17:26:27.467793] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d2cd20 name Existed_Raid, state configuring 00:32:32.138 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:32:32.397 [2024-07-23 17:26:27.708422] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:32.397 [2024-07-23 17:26:27.709810] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:32:32.397 [2024-07-23 17:26:27.709839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:32:32.397 
17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:32.397 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:32.656 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:32.656 "name": "Existed_Raid", 00:32:32.656 "uuid": 
"fcb24065-c7d0-432a-8b44-aff3e642f3af", 00:32:32.656 "strip_size_kb": 0, 00:32:32.656 "state": "configuring", 00:32:32.656 "raid_level": "raid1", 00:32:32.656 "superblock": true, 00:32:32.656 "num_base_bdevs": 2, 00:32:32.656 "num_base_bdevs_discovered": 1, 00:32:32.656 "num_base_bdevs_operational": 2, 00:32:32.656 "base_bdevs_list": [ 00:32:32.656 { 00:32:32.656 "name": "BaseBdev1", 00:32:32.656 "uuid": "19c9f5a8-6061-4210-8917-8f71db8ae9ba", 00:32:32.656 "is_configured": true, 00:32:32.656 "data_offset": 256, 00:32:32.656 "data_size": 7936 00:32:32.656 }, 00:32:32.656 { 00:32:32.656 "name": "BaseBdev2", 00:32:32.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:32.656 "is_configured": false, 00:32:32.656 "data_offset": 0, 00:32:32.656 "data_size": 0 00:32:32.656 } 00:32:32.656 ] 00:32:32.656 }' 00:32:32.656 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:32.656 17:26:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:33.223 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:32:33.482 [2024-07-23 17:26:28.876093] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:33.482 [2024-07-23 17:26:28.876258] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d2c100 00:32:33.482 [2024-07-23 17:26:28.876271] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:33.482 [2024-07-23 17:26:28.876335] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eca200 00:32:33.482 [2024-07-23 17:26:28.876436] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d2c100 00:32:33.482 [2024-07-23 17:26:28.876446] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1d2c100 00:32:33.482 [2024-07-23 17:26:28.876509] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:33.482 BaseBdev2 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:33.482 17:26:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:32:33.740 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:32:34.000 [ 00:32:34.000 { 00:32:34.000 "name": "BaseBdev2", 00:32:34.000 "aliases": [ 00:32:34.000 "35adddb6-ce3c-4221-9210-bb284e94dc97" 00:32:34.000 ], 00:32:34.000 "product_name": "Malloc disk", 00:32:34.000 "block_size": 4096, 00:32:34.000 "num_blocks": 8192, 00:32:34.000 "uuid": "35adddb6-ce3c-4221-9210-bb284e94dc97", 00:32:34.000 "md_size": 32, 00:32:34.000 "md_interleave": false, 00:32:34.000 "dif_type": 0, 00:32:34.000 "assigned_rate_limits": { 00:32:34.000 "rw_ios_per_sec": 0, 00:32:34.000 "rw_mbytes_per_sec": 0, 00:32:34.000 "r_mbytes_per_sec": 0, 00:32:34.000 
"w_mbytes_per_sec": 0 00:32:34.000 }, 00:32:34.000 "claimed": true, 00:32:34.000 "claim_type": "exclusive_write", 00:32:34.000 "zoned": false, 00:32:34.000 "supported_io_types": { 00:32:34.000 "read": true, 00:32:34.000 "write": true, 00:32:34.000 "unmap": true, 00:32:34.000 "flush": true, 00:32:34.000 "reset": true, 00:32:34.000 "nvme_admin": false, 00:32:34.000 "nvme_io": false, 00:32:34.000 "nvme_io_md": false, 00:32:34.000 "write_zeroes": true, 00:32:34.000 "zcopy": true, 00:32:34.000 "get_zone_info": false, 00:32:34.000 "zone_management": false, 00:32:34.000 "zone_append": false, 00:32:34.000 "compare": false, 00:32:34.000 "compare_and_write": false, 00:32:34.000 "abort": true, 00:32:34.000 "seek_hole": false, 00:32:34.000 "seek_data": false, 00:32:34.000 "copy": true, 00:32:34.000 "nvme_iov_md": false 00:32:34.000 }, 00:32:34.000 "memory_domains": [ 00:32:34.000 { 00:32:34.000 "dma_device_id": "system", 00:32:34.000 "dma_device_type": 1 00:32:34.000 }, 00:32:34.000 { 00:32:34.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:34.000 "dma_device_type": 2 00:32:34.000 } 00:32:34.000 ], 00:32:34.000 "driver_specific": {} 00:32:34.000 } 00:32:34.000 ] 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:34.000 17:26:29 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:34.000 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:34.259 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:34.259 "name": "Existed_Raid", 00:32:34.259 "uuid": "fcb24065-c7d0-432a-8b44-aff3e642f3af", 00:32:34.259 "strip_size_kb": 0, 00:32:34.259 "state": "online", 00:32:34.259 "raid_level": "raid1", 00:32:34.259 "superblock": true, 00:32:34.259 "num_base_bdevs": 2, 00:32:34.259 "num_base_bdevs_discovered": 2, 00:32:34.259 "num_base_bdevs_operational": 2, 00:32:34.259 "base_bdevs_list": [ 00:32:34.259 { 00:32:34.259 "name": "BaseBdev1", 00:32:34.259 "uuid": "19c9f5a8-6061-4210-8917-8f71db8ae9ba", 00:32:34.259 "is_configured": true, 00:32:34.259 "data_offset": 256, 00:32:34.259 "data_size": 7936 00:32:34.259 }, 00:32:34.259 { 00:32:34.259 "name": 
"BaseBdev2", 00:32:34.259 "uuid": "35adddb6-ce3c-4221-9210-bb284e94dc97", 00:32:34.259 "is_configured": true, 00:32:34.259 "data_offset": 256, 00:32:34.259 "data_size": 7936 00:32:34.259 } 00:32:34.259 ] 00:32:34.259 }' 00:32:34.259 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:34.259 17:26:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:35.250 [2024-07-23 17:26:30.524915] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:35.250 "name": "Existed_Raid", 00:32:35.250 "aliases": [ 00:32:35.250 "fcb24065-c7d0-432a-8b44-aff3e642f3af" 00:32:35.250 ], 00:32:35.250 "product_name": "Raid Volume", 00:32:35.250 "block_size": 4096, 
00:32:35.250 "num_blocks": 7936, 00:32:35.250 "uuid": "fcb24065-c7d0-432a-8b44-aff3e642f3af", 00:32:35.250 "md_size": 32, 00:32:35.250 "md_interleave": false, 00:32:35.250 "dif_type": 0, 00:32:35.250 "assigned_rate_limits": { 00:32:35.250 "rw_ios_per_sec": 0, 00:32:35.250 "rw_mbytes_per_sec": 0, 00:32:35.250 "r_mbytes_per_sec": 0, 00:32:35.250 "w_mbytes_per_sec": 0 00:32:35.250 }, 00:32:35.250 "claimed": false, 00:32:35.250 "zoned": false, 00:32:35.250 "supported_io_types": { 00:32:35.250 "read": true, 00:32:35.250 "write": true, 00:32:35.250 "unmap": false, 00:32:35.250 "flush": false, 00:32:35.250 "reset": true, 00:32:35.250 "nvme_admin": false, 00:32:35.250 "nvme_io": false, 00:32:35.250 "nvme_io_md": false, 00:32:35.250 "write_zeroes": true, 00:32:35.250 "zcopy": false, 00:32:35.250 "get_zone_info": false, 00:32:35.250 "zone_management": false, 00:32:35.250 "zone_append": false, 00:32:35.250 "compare": false, 00:32:35.250 "compare_and_write": false, 00:32:35.250 "abort": false, 00:32:35.250 "seek_hole": false, 00:32:35.250 "seek_data": false, 00:32:35.250 "copy": false, 00:32:35.250 "nvme_iov_md": false 00:32:35.250 }, 00:32:35.250 "memory_domains": [ 00:32:35.250 { 00:32:35.250 "dma_device_id": "system", 00:32:35.250 "dma_device_type": 1 00:32:35.250 }, 00:32:35.250 { 00:32:35.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:35.250 "dma_device_type": 2 00:32:35.250 }, 00:32:35.250 { 00:32:35.250 "dma_device_id": "system", 00:32:35.250 "dma_device_type": 1 00:32:35.250 }, 00:32:35.250 { 00:32:35.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:35.250 "dma_device_type": 2 00:32:35.250 } 00:32:35.250 ], 00:32:35.250 "driver_specific": { 00:32:35.250 "raid": { 00:32:35.250 "uuid": "fcb24065-c7d0-432a-8b44-aff3e642f3af", 00:32:35.250 "strip_size_kb": 0, 00:32:35.250 "state": "online", 00:32:35.250 "raid_level": "raid1", 00:32:35.250 "superblock": true, 00:32:35.250 "num_base_bdevs": 2, 00:32:35.250 "num_base_bdevs_discovered": 2, 00:32:35.250 
"num_base_bdevs_operational": 2, 00:32:35.250 "base_bdevs_list": [ 00:32:35.250 { 00:32:35.250 "name": "BaseBdev1", 00:32:35.250 "uuid": "19c9f5a8-6061-4210-8917-8f71db8ae9ba", 00:32:35.250 "is_configured": true, 00:32:35.250 "data_offset": 256, 00:32:35.250 "data_size": 7936 00:32:35.250 }, 00:32:35.250 { 00:32:35.250 "name": "BaseBdev2", 00:32:35.250 "uuid": "35adddb6-ce3c-4221-9210-bb284e94dc97", 00:32:35.250 "is_configured": true, 00:32:35.250 "data_offset": 256, 00:32:35.250 "data_size": 7936 00:32:35.250 } 00:32:35.250 ] 00:32:35.250 } 00:32:35.250 } 00:32:35.250 }' 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:32:35.250 BaseBdev2' 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:32:35.250 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:35.508 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:35.508 "name": "BaseBdev1", 00:32:35.508 "aliases": [ 00:32:35.508 "19c9f5a8-6061-4210-8917-8f71db8ae9ba" 00:32:35.508 ], 00:32:35.508 "product_name": "Malloc disk", 00:32:35.508 "block_size": 4096, 00:32:35.508 "num_blocks": 8192, 00:32:35.508 "uuid": "19c9f5a8-6061-4210-8917-8f71db8ae9ba", 00:32:35.508 "md_size": 32, 00:32:35.508 "md_interleave": false, 00:32:35.508 "dif_type": 0, 00:32:35.508 "assigned_rate_limits": { 00:32:35.508 "rw_ios_per_sec": 0, 00:32:35.508 
"rw_mbytes_per_sec": 0, 00:32:35.508 "r_mbytes_per_sec": 0, 00:32:35.508 "w_mbytes_per_sec": 0 00:32:35.508 }, 00:32:35.508 "claimed": true, 00:32:35.508 "claim_type": "exclusive_write", 00:32:35.508 "zoned": false, 00:32:35.508 "supported_io_types": { 00:32:35.508 "read": true, 00:32:35.508 "write": true, 00:32:35.508 "unmap": true, 00:32:35.508 "flush": true, 00:32:35.508 "reset": true, 00:32:35.508 "nvme_admin": false, 00:32:35.508 "nvme_io": false, 00:32:35.508 "nvme_io_md": false, 00:32:35.508 "write_zeroes": true, 00:32:35.508 "zcopy": true, 00:32:35.508 "get_zone_info": false, 00:32:35.508 "zone_management": false, 00:32:35.508 "zone_append": false, 00:32:35.508 "compare": false, 00:32:35.508 "compare_and_write": false, 00:32:35.508 "abort": true, 00:32:35.508 "seek_hole": false, 00:32:35.508 "seek_data": false, 00:32:35.508 "copy": true, 00:32:35.508 "nvme_iov_md": false 00:32:35.508 }, 00:32:35.508 "memory_domains": [ 00:32:35.508 { 00:32:35.508 "dma_device_id": "system", 00:32:35.508 "dma_device_type": 1 00:32:35.508 }, 00:32:35.508 { 00:32:35.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:35.508 "dma_device_type": 2 00:32:35.508 } 00:32:35.508 ], 00:32:35.508 "driver_specific": {} 00:32:35.508 }' 00:32:35.508 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:35.508 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:35.508 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:32:35.508 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:35.767 17:26:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:35.767 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:32:35.767 17:26:31 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:35.767 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:35.767 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:32:36.025 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:36.025 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:36.025 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:32:36.025 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:36.025 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:32:36.025 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:36.284 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:36.284 "name": "BaseBdev2", 00:32:36.284 "aliases": [ 00:32:36.284 "35adddb6-ce3c-4221-9210-bb284e94dc97" 00:32:36.284 ], 00:32:36.284 "product_name": "Malloc disk", 00:32:36.284 "block_size": 4096, 00:32:36.284 "num_blocks": 8192, 00:32:36.284 "uuid": "35adddb6-ce3c-4221-9210-bb284e94dc97", 00:32:36.284 "md_size": 32, 00:32:36.284 "md_interleave": false, 00:32:36.284 "dif_type": 0, 00:32:36.284 "assigned_rate_limits": { 00:32:36.284 "rw_ios_per_sec": 0, 00:32:36.284 "rw_mbytes_per_sec": 0, 00:32:36.284 "r_mbytes_per_sec": 0, 00:32:36.284 "w_mbytes_per_sec": 0 00:32:36.284 }, 00:32:36.284 "claimed": true, 00:32:36.284 "claim_type": "exclusive_write", 00:32:36.284 "zoned": false, 00:32:36.284 "supported_io_types": { 
00:32:36.284 "read": true, 00:32:36.284 "write": true, 00:32:36.284 "unmap": true, 00:32:36.284 "flush": true, 00:32:36.284 "reset": true, 00:32:36.284 "nvme_admin": false, 00:32:36.284 "nvme_io": false, 00:32:36.284 "nvme_io_md": false, 00:32:36.284 "write_zeroes": true, 00:32:36.284 "zcopy": true, 00:32:36.284 "get_zone_info": false, 00:32:36.284 "zone_management": false, 00:32:36.284 "zone_append": false, 00:32:36.284 "compare": false, 00:32:36.284 "compare_and_write": false, 00:32:36.284 "abort": true, 00:32:36.284 "seek_hole": false, 00:32:36.285 "seek_data": false, 00:32:36.285 "copy": true, 00:32:36.285 "nvme_iov_md": false 00:32:36.285 }, 00:32:36.285 "memory_domains": [ 00:32:36.285 { 00:32:36.285 "dma_device_id": "system", 00:32:36.285 "dma_device_type": 1 00:32:36.285 }, 00:32:36.285 { 00:32:36.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:36.285 "dma_device_type": 2 00:32:36.285 } 00:32:36.285 ], 00:32:36.285 "driver_specific": {} 00:32:36.285 }' 00:32:36.285 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:36.285 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:36.285 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:32:36.285 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:36.285 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:32:36.543 17:26:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:32:36.802 [2024-07-23 17:26:32.128891] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:36.802 
17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:36.802 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:36.803 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:36.803 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:36.803 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:32:37.061 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:37.061 "name": "Existed_Raid", 00:32:37.061 "uuid": "fcb24065-c7d0-432a-8b44-aff3e642f3af", 00:32:37.061 "strip_size_kb": 0, 00:32:37.061 "state": "online", 00:32:37.061 "raid_level": "raid1", 00:32:37.061 "superblock": true, 00:32:37.061 "num_base_bdevs": 2, 00:32:37.061 "num_base_bdevs_discovered": 1, 00:32:37.061 "num_base_bdevs_operational": 1, 00:32:37.061 "base_bdevs_list": [ 00:32:37.061 { 00:32:37.061 "name": null, 00:32:37.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:37.061 "is_configured": false, 00:32:37.061 "data_offset": 256, 00:32:37.061 "data_size": 7936 00:32:37.061 }, 00:32:37.061 { 00:32:37.061 "name": "BaseBdev2", 00:32:37.061 "uuid": "35adddb6-ce3c-4221-9210-bb284e94dc97", 00:32:37.061 "is_configured": true, 00:32:37.061 "data_offset": 256, 00:32:37.061 "data_size": 7936 00:32:37.061 } 00:32:37.061 ] 00:32:37.061 }' 00:32:37.061 17:26:32 
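[Editor's note, not part of the original run: after `bdev_malloc_delete BaseBdev1`, the raid1 bdev stays online because raid1 tolerates one missing member; the removed slot becomes an unconfigured placeholder with a null name and an all-zero UUID, as shown in the JSON above. A small sketch of the discovered-count check that `verify_raid_bdev_state` performs, using the values captured in the log:]

```python
# Post-removal base_bdevs_list, copied from the logged raid_bdev_info JSON.
base_bdevs_list = [
    {"name": None, "uuid": "00000000-0000-0000-0000-000000000000",
     "is_configured": False},
    {"name": "BaseBdev2", "uuid": "35adddb6-ce3c-4221-9210-bb284e94dc97",
     "is_configured": True},
]

# Count configured (discovered) members; the test expects 1 after removal,
# with the array state still "online" thanks to raid1 redundancy.
num_discovered = sum(1 for b in base_bdevs_list if b["is_configured"])
print(num_discovered)  # 1
```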
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:37.061 17:26:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:37.630 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:32:37.630 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:37.630 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:37.630 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:32:37.888 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:32:37.888 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:32:37.888 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:32:38.146 [2024-07-23 17:26:33.435020] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:32:38.146 [2024-07-23 17:26:33.435113] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:38.146 [2024-07-23 17:26:33.448256] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:38.146 [2024-07-23 17:26:33.448294] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:38.146 [2024-07-23 17:26:33.448305] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d2c100 name Existed_Raid, state offline 00:32:38.146 17:26:33 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:32:38.146 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:32:38.146 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:38.146 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 67502 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 67502 ']' 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 67502 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 67502 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:38.405 17:26:33 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 67502' 00:32:38.405 killing process with pid 67502 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 67502 00:32:38.405 [2024-07-23 17:26:33.784641] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:38.405 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 67502 00:32:38.405 [2024-07-23 17:26:33.785509] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:32:38.665 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:32:38.665 00:32:38.665 real 0m10.786s 00:32:38.665 user 0m19.213s 00:32:38.665 sys 0m2.019s 00:32:38.665 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:38.665 17:26:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:38.665 ************************************ 00:32:38.665 END TEST raid_state_function_test_sb_md_separate 00:32:38.665 ************************************ 00:32:38.665 17:26:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:32:38.665 17:26:34 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:32:38.665 17:26:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:32:38.665 17:26:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:38.665 17:26:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:38.665 ************************************ 00:32:38.665 START TEST raid_superblock_test_md_separate 00:32:38.665 ************************************ 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:32:38.665 
17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=69134 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 69134 /var/tmp/spdk-raid.sock 
00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 69134 ']' 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:38.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:38.665 17:26:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:38.924 [2024-07-23 17:26:34.122731] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:32:38.924 [2024-07-23 17:26:34.122796] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid69134 ] 00:32:38.924 [2024-07-23 17:26:34.254793] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:38.924 [2024-07-23 17:26:34.308461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:39.183 [2024-07-23 17:26:34.376463] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:39.183 [2024-07-23 17:26:34.376502] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:32:39.750 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:32:39.750 17:26:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:32:40.009 malloc1 00:32:40.009 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:32:40.268 [2024-07-23 17:26:35.537427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:32:40.268 [2024-07-23 17:26:35.537476] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:40.268 [2024-07-23 17:26:35.537496] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17a65a0 00:32:40.268 [2024-07-23 17:26:35.537509] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:40.268 [2024-07-23 17:26:35.539177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:40.268 [2024-07-23 17:26:35.539205] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:32:40.268 pt1 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:32:40.268 17:26:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:32:40.268 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:32:40.527 malloc2 00:32:40.527 17:26:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:40.786 [2024-07-23 17:26:36.032293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:40.786 [2024-07-23 17:26:36.032339] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:40.786 [2024-07-23 17:26:36.032357] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179d050 00:32:40.786 [2024-07-23 17:26:36.032375] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:40.786 [2024-07-23 17:26:36.033748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:40.786 [2024-07-23 17:26:36.033776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:40.786 pt2 00:32:40.786 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:32:40.786 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:32:40.786 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:32:41.045 [2024-07-23 17:26:36.272962] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:32:41.045 [2024-07-23 17:26:36.274295] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:41.045 [2024-07-23 17:26:36.274444] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x179dfc0 00:32:41.045 [2024-07-23 17:26:36.274457] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:41.045 [2024-07-23 17:26:36.274527] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1608fd0 00:32:41.045 [2024-07-23 17:26:36.274645] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x179dfc0 00:32:41.045 [2024-07-23 17:26:36.274655] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x179dfc0 00:32:41.045 [2024-07-23 17:26:36.274725] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:41.046 17:26:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:41.046 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:41.305 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:41.305 "name": "raid_bdev1", 00:32:41.305 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:41.305 "strip_size_kb": 0, 00:32:41.305 "state": "online", 00:32:41.305 "raid_level": "raid1", 00:32:41.305 "superblock": true, 00:32:41.305 "num_base_bdevs": 2, 00:32:41.305 "num_base_bdevs_discovered": 2, 00:32:41.305 "num_base_bdevs_operational": 2, 00:32:41.305 "base_bdevs_list": [ 00:32:41.305 { 00:32:41.305 "name": "pt1", 00:32:41.305 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:41.305 "is_configured": true, 00:32:41.305 "data_offset": 256, 00:32:41.305 "data_size": 7936 00:32:41.305 }, 00:32:41.305 { 00:32:41.305 "name": "pt2", 00:32:41.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:41.305 "is_configured": true, 00:32:41.305 "data_offset": 256, 00:32:41.305 "data_size": 7936 00:32:41.305 } 00:32:41.305 ] 00:32:41.305 }' 00:32:41.305 17:26:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:41.305 17:26:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:42.241 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:42.500 [2024-07-23 17:26:37.805252] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:42.500 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:42.500 "name": "raid_bdev1", 00:32:42.500 "aliases": [ 00:32:42.500 "a766bf16-3d97-4b0c-af65-bd5e302f9744" 00:32:42.500 ], 00:32:42.500 "product_name": "Raid Volume", 00:32:42.500 "block_size": 4096, 00:32:42.500 "num_blocks": 7936, 00:32:42.500 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:42.500 "md_size": 32, 00:32:42.500 "md_interleave": false, 00:32:42.500 "dif_type": 0, 00:32:42.500 "assigned_rate_limits": { 00:32:42.500 "rw_ios_per_sec": 0, 00:32:42.500 "rw_mbytes_per_sec": 0, 00:32:42.500 "r_mbytes_per_sec": 0, 00:32:42.500 "w_mbytes_per_sec": 0 00:32:42.500 }, 00:32:42.500 "claimed": false, 00:32:42.500 "zoned": false, 00:32:42.500 "supported_io_types": { 00:32:42.500 "read": true, 00:32:42.500 "write": true, 00:32:42.500 "unmap": false, 00:32:42.500 "flush": false, 00:32:42.500 "reset": true, 00:32:42.500 "nvme_admin": false, 00:32:42.500 "nvme_io": false, 00:32:42.500 "nvme_io_md": false, 00:32:42.500 "write_zeroes": true, 
00:32:42.500 "zcopy": false, 00:32:42.500 "get_zone_info": false, 00:32:42.500 "zone_management": false, 00:32:42.500 "zone_append": false, 00:32:42.500 "compare": false, 00:32:42.500 "compare_and_write": false, 00:32:42.500 "abort": false, 00:32:42.500 "seek_hole": false, 00:32:42.500 "seek_data": false, 00:32:42.500 "copy": false, 00:32:42.500 "nvme_iov_md": false 00:32:42.500 }, 00:32:42.500 "memory_domains": [ 00:32:42.500 { 00:32:42.500 "dma_device_id": "system", 00:32:42.500 "dma_device_type": 1 00:32:42.500 }, 00:32:42.500 { 00:32:42.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:42.500 "dma_device_type": 2 00:32:42.500 }, 00:32:42.500 { 00:32:42.500 "dma_device_id": "system", 00:32:42.500 "dma_device_type": 1 00:32:42.500 }, 00:32:42.500 { 00:32:42.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:42.500 "dma_device_type": 2 00:32:42.500 } 00:32:42.500 ], 00:32:42.500 "driver_specific": { 00:32:42.500 "raid": { 00:32:42.500 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:42.500 "strip_size_kb": 0, 00:32:42.500 "state": "online", 00:32:42.500 "raid_level": "raid1", 00:32:42.500 "superblock": true, 00:32:42.500 "num_base_bdevs": 2, 00:32:42.500 "num_base_bdevs_discovered": 2, 00:32:42.500 "num_base_bdevs_operational": 2, 00:32:42.500 "base_bdevs_list": [ 00:32:42.500 { 00:32:42.500 "name": "pt1", 00:32:42.500 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:42.500 "is_configured": true, 00:32:42.500 "data_offset": 256, 00:32:42.500 "data_size": 7936 00:32:42.500 }, 00:32:42.500 { 00:32:42.500 "name": "pt2", 00:32:42.500 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:42.500 "is_configured": true, 00:32:42.500 "data_offset": 256, 00:32:42.500 "data_size": 7936 00:32:42.500 } 00:32:42.500 ] 00:32:42.500 } 00:32:42.500 } 00:32:42.500 }' 00:32:42.500 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:42.500 17:26:37 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:32:42.500 pt2' 00:32:42.500 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:42.500 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:32:42.500 17:26:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:43.067 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:43.067 "name": "pt1", 00:32:43.067 "aliases": [ 00:32:43.067 "00000000-0000-0000-0000-000000000001" 00:32:43.067 ], 00:32:43.067 "product_name": "passthru", 00:32:43.067 "block_size": 4096, 00:32:43.067 "num_blocks": 8192, 00:32:43.067 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:43.067 "md_size": 32, 00:32:43.067 "md_interleave": false, 00:32:43.067 "dif_type": 0, 00:32:43.067 "assigned_rate_limits": { 00:32:43.067 "rw_ios_per_sec": 0, 00:32:43.067 "rw_mbytes_per_sec": 0, 00:32:43.067 "r_mbytes_per_sec": 0, 00:32:43.067 "w_mbytes_per_sec": 0 00:32:43.067 }, 00:32:43.067 "claimed": true, 00:32:43.067 "claim_type": "exclusive_write", 00:32:43.067 "zoned": false, 00:32:43.067 "supported_io_types": { 00:32:43.067 "read": true, 00:32:43.067 "write": true, 00:32:43.067 "unmap": true, 00:32:43.067 "flush": true, 00:32:43.067 "reset": true, 00:32:43.067 "nvme_admin": false, 00:32:43.067 "nvme_io": false, 00:32:43.067 "nvme_io_md": false, 00:32:43.067 "write_zeroes": true, 00:32:43.067 "zcopy": true, 00:32:43.067 "get_zone_info": false, 00:32:43.067 "zone_management": false, 00:32:43.067 "zone_append": false, 00:32:43.067 "compare": false, 00:32:43.067 "compare_and_write": false, 00:32:43.067 "abort": true, 00:32:43.067 "seek_hole": false, 00:32:43.067 "seek_data": false, 00:32:43.067 "copy": true, 00:32:43.067 
"nvme_iov_md": false 00:32:43.067 }, 00:32:43.067 "memory_domains": [ 00:32:43.067 { 00:32:43.067 "dma_device_id": "system", 00:32:43.067 "dma_device_type": 1 00:32:43.067 }, 00:32:43.067 { 00:32:43.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:43.067 "dma_device_type": 2 00:32:43.067 } 00:32:43.067 ], 00:32:43.067 "driver_specific": { 00:32:43.067 "passthru": { 00:32:43.067 "name": "pt1", 00:32:43.067 "base_bdev_name": "malloc1" 00:32:43.067 } 00:32:43.067 } 00:32:43.067 }' 00:32:43.067 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:43.067 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:43.067 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:32:43.067 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:43.326 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:43.585 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:32:43.585 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:43.585 17:26:38 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:32:43.585 17:26:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:43.844 "name": "pt2", 00:32:43.844 "aliases": [ 00:32:43.844 "00000000-0000-0000-0000-000000000002" 00:32:43.844 ], 00:32:43.844 "product_name": "passthru", 00:32:43.844 "block_size": 4096, 00:32:43.844 "num_blocks": 8192, 00:32:43.844 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:43.844 "md_size": 32, 00:32:43.844 "md_interleave": false, 00:32:43.844 "dif_type": 0, 00:32:43.844 "assigned_rate_limits": { 00:32:43.844 "rw_ios_per_sec": 0, 00:32:43.844 "rw_mbytes_per_sec": 0, 00:32:43.844 "r_mbytes_per_sec": 0, 00:32:43.844 "w_mbytes_per_sec": 0 00:32:43.844 }, 00:32:43.844 "claimed": true, 00:32:43.844 "claim_type": "exclusive_write", 00:32:43.844 "zoned": false, 00:32:43.844 "supported_io_types": { 00:32:43.844 "read": true, 00:32:43.844 "write": true, 00:32:43.844 "unmap": true, 00:32:43.844 "flush": true, 00:32:43.844 "reset": true, 00:32:43.844 "nvme_admin": false, 00:32:43.844 "nvme_io": false, 00:32:43.844 "nvme_io_md": false, 00:32:43.844 "write_zeroes": true, 00:32:43.844 "zcopy": true, 00:32:43.844 "get_zone_info": false, 00:32:43.844 "zone_management": false, 00:32:43.844 "zone_append": false, 00:32:43.844 "compare": false, 00:32:43.844 "compare_and_write": false, 00:32:43.844 "abort": true, 00:32:43.844 "seek_hole": false, 00:32:43.844 "seek_data": false, 00:32:43.844 "copy": true, 00:32:43.844 "nvme_iov_md": false 00:32:43.844 }, 00:32:43.844 "memory_domains": [ 00:32:43.844 { 00:32:43.844 "dma_device_id": "system", 00:32:43.844 "dma_device_type": 1 00:32:43.844 }, 00:32:43.844 { 00:32:43.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:43.844 "dma_device_type": 2 00:32:43.844 } 
00:32:43.844 ], 00:32:43.844 "driver_specific": { 00:32:43.844 "passthru": { 00:32:43.844 "name": "pt2", 00:32:43.844 "base_bdev_name": "malloc2" 00:32:43.844 } 00:32:43.844 } 00:32:43.844 }' 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:32:43.844 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:44.102 17:26:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:32:44.670 [2024-07-23 17:26:40.003095] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:44.670 17:26:40 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a766bf16-3d97-4b0c-af65-bd5e302f9744 00:32:44.670 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z a766bf16-3d97-4b0c-af65-bd5e302f9744 ']' 00:32:44.670 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:44.929 [2024-07-23 17:26:40.255509] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:44.929 [2024-07-23 17:26:40.255537] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:44.929 [2024-07-23 17:26:40.255594] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:44.929 [2024-07-23 17:26:40.255650] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:44.929 [2024-07-23 17:26:40.255663] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179dfc0 name raid_bdev1, state offline 00:32:44.929 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:44.929 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:32:45.497 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:32:45.497 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:32:45.497 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:32:45.497 17:26:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:32:45.756 17:26:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:32:45.756 17:26:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:32:46.015 17:26:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:32:46.015 17:26:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:32:46.582 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:46.582 17:26:41 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:32:46.583 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:46.583 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:32:46.583 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:46.583 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:32:46.583 17:26:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:32:46.841 [2024-07-23 17:26:42.036130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:32:46.841 [2024-07-23 17:26:42.037499] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:32:46.841 [2024-07-23 17:26:42.037555] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:32:46.841 [2024-07-23 17:26:42.037594] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:32:46.841 [2024-07-23 17:26:42.037613] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:46.841 [2024-07-23 17:26:42.037623] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179ed10 name raid_bdev1, state configuring 00:32:46.841 request: 00:32:46.841 { 00:32:46.841 "name": "raid_bdev1", 00:32:46.841 "raid_level": "raid1", 00:32:46.841 "base_bdevs": [ 
00:32:46.841 "malloc1", 00:32:46.841 "malloc2" 00:32:46.841 ], 00:32:46.841 "superblock": false, 00:32:46.841 "method": "bdev_raid_create", 00:32:46.841 "req_id": 1 00:32:46.841 } 00:32:46.841 Got JSON-RPC error response 00:32:46.841 response: 00:32:46.841 { 00:32:46.841 "code": -17, 00:32:46.841 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:32:46.841 } 00:32:46.841 17:26:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:32:46.841 17:26:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:32:46.841 17:26:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:32:46.841 17:26:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:32:46.841 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:46.841 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:32:47.100 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:32:47.100 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:32:47.100 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:32:47.360 [2024-07-23 17:26:42.581514] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:32:47.360 [2024-07-23 17:26:42.581565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:47.360 [2024-07-23 17:26:42.581584] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16099e0 
00:32:47.360 [2024-07-23 17:26:42.581597] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:47.360 [2024-07-23 17:26:42.583088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:47.360 [2024-07-23 17:26:42.583117] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:32:47.360 [2024-07-23 17:26:42.583166] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:32:47.360 [2024-07-23 17:26:42.583191] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:32:47.360 pt1 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:47.360 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:47.618 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:47.618 "name": "raid_bdev1", 00:32:47.618 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:47.618 "strip_size_kb": 0, 00:32:47.618 "state": "configuring", 00:32:47.618 "raid_level": "raid1", 00:32:47.618 "superblock": true, 00:32:47.618 "num_base_bdevs": 2, 00:32:47.618 "num_base_bdevs_discovered": 1, 00:32:47.618 "num_base_bdevs_operational": 2, 00:32:47.618 "base_bdevs_list": [ 00:32:47.618 { 00:32:47.618 "name": "pt1", 00:32:47.618 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:47.618 "is_configured": true, 00:32:47.618 "data_offset": 256, 00:32:47.618 "data_size": 7936 00:32:47.618 }, 00:32:47.618 { 00:32:47.618 "name": null, 00:32:47.619 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:47.619 "is_configured": false, 00:32:47.619 "data_offset": 256, 00:32:47.619 "data_size": 7936 00:32:47.619 } 00:32:47.619 ] 00:32:47.619 }' 00:32:47.619 17:26:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:47.619 17:26:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:48.552 17:26:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:32:48.552 17:26:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:32:48.552 17:26:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:32:48.552 17:26:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:48.811 [2024-07-23 17:26:44.029371] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:48.811 [2024-07-23 17:26:44.029425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:48.811 [2024-07-23 17:26:44.029452] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17a06e0 00:32:48.811 [2024-07-23 17:26:44.029465] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:48.811 [2024-07-23 17:26:44.029661] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:48.811 [2024-07-23 17:26:44.029676] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:48.811 [2024-07-23 17:26:44.029722] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:32:48.811 [2024-07-23 17:26:44.029741] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:48.811 [2024-07-23 17:26:44.029828] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17a1060 00:32:48.811 [2024-07-23 17:26:44.029838] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:48.811 [2024-07-23 17:26:44.029908] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a0c70 00:32:48.811 [2024-07-23 17:26:44.030011] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17a1060 00:32:48.811 [2024-07-23 17:26:44.030021] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17a1060 00:32:48.811 [2024-07-23 17:26:44.030090] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:48.811 pt2 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:48.811 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:49.378 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:49.378 "name": "raid_bdev1", 00:32:49.378 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:49.378 "strip_size_kb": 0, 00:32:49.378 "state": "online", 00:32:49.378 "raid_level": "raid1", 00:32:49.378 "superblock": true, 00:32:49.378 "num_base_bdevs": 2, 00:32:49.378 
"num_base_bdevs_discovered": 2, 00:32:49.378 "num_base_bdevs_operational": 2, 00:32:49.378 "base_bdevs_list": [ 00:32:49.378 { 00:32:49.378 "name": "pt1", 00:32:49.378 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:49.378 "is_configured": true, 00:32:49.378 "data_offset": 256, 00:32:49.378 "data_size": 7936 00:32:49.378 }, 00:32:49.378 { 00:32:49.378 "name": "pt2", 00:32:49.378 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:49.378 "is_configured": true, 00:32:49.378 "data_offset": 256, 00:32:49.378 "data_size": 7936 00:32:49.378 } 00:32:49.378 ] 00:32:49.378 }' 00:32:49.378 17:26:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:49.378 17:26:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:32:50.354 [2024-07-23 17:26:45.605819] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:50.354 17:26:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:32:50.354 "name": "raid_bdev1", 00:32:50.354 "aliases": [ 00:32:50.354 "a766bf16-3d97-4b0c-af65-bd5e302f9744" 00:32:50.354 ], 00:32:50.354 "product_name": "Raid Volume", 00:32:50.354 "block_size": 4096, 00:32:50.354 "num_blocks": 7936, 00:32:50.354 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:50.354 "md_size": 32, 00:32:50.354 "md_interleave": false, 00:32:50.354 "dif_type": 0, 00:32:50.354 "assigned_rate_limits": { 00:32:50.354 "rw_ios_per_sec": 0, 00:32:50.354 "rw_mbytes_per_sec": 0, 00:32:50.354 "r_mbytes_per_sec": 0, 00:32:50.354 "w_mbytes_per_sec": 0 00:32:50.354 }, 00:32:50.354 "claimed": false, 00:32:50.354 "zoned": false, 00:32:50.354 "supported_io_types": { 00:32:50.354 "read": true, 00:32:50.354 "write": true, 00:32:50.354 "unmap": false, 00:32:50.354 "flush": false, 00:32:50.354 "reset": true, 00:32:50.354 "nvme_admin": false, 00:32:50.354 "nvme_io": false, 00:32:50.354 "nvme_io_md": false, 00:32:50.354 "write_zeroes": true, 00:32:50.354 "zcopy": false, 00:32:50.354 "get_zone_info": false, 00:32:50.354 "zone_management": false, 00:32:50.354 "zone_append": false, 00:32:50.354 "compare": false, 00:32:50.354 "compare_and_write": false, 00:32:50.354 "abort": false, 00:32:50.354 "seek_hole": false, 00:32:50.354 "seek_data": false, 00:32:50.354 "copy": false, 00:32:50.354 "nvme_iov_md": false 00:32:50.354 }, 00:32:50.354 "memory_domains": [ 00:32:50.354 { 00:32:50.354 "dma_device_id": "system", 00:32:50.354 "dma_device_type": 1 00:32:50.354 }, 00:32:50.354 { 00:32:50.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:50.354 "dma_device_type": 2 00:32:50.354 }, 00:32:50.354 { 00:32:50.354 "dma_device_id": "system", 00:32:50.354 "dma_device_type": 1 00:32:50.354 }, 00:32:50.354 { 00:32:50.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:50.354 "dma_device_type": 2 00:32:50.354 } 00:32:50.354 ], 00:32:50.354 "driver_specific": { 00:32:50.354 "raid": { 
00:32:50.354 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:50.354 "strip_size_kb": 0, 00:32:50.354 "state": "online", 00:32:50.354 "raid_level": "raid1", 00:32:50.354 "superblock": true, 00:32:50.354 "num_base_bdevs": 2, 00:32:50.354 "num_base_bdevs_discovered": 2, 00:32:50.354 "num_base_bdevs_operational": 2, 00:32:50.354 "base_bdevs_list": [ 00:32:50.354 { 00:32:50.354 "name": "pt1", 00:32:50.354 "uuid": "00000000-0000-0000-0000-000000000001", 00:32:50.354 "is_configured": true, 00:32:50.354 "data_offset": 256, 00:32:50.354 "data_size": 7936 00:32:50.354 }, 00:32:50.354 { 00:32:50.354 "name": "pt2", 00:32:50.354 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:50.354 "is_configured": true, 00:32:50.354 "data_offset": 256, 00:32:50.354 "data_size": 7936 00:32:50.354 } 00:32:50.354 ] 00:32:50.354 } 00:32:50.354 } 00:32:50.354 }' 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:32:50.354 pt2' 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:32:50.354 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:50.613 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:50.613 "name": "pt1", 00:32:50.613 "aliases": [ 00:32:50.613 "00000000-0000-0000-0000-000000000001" 00:32:50.613 ], 00:32:50.613 "product_name": "passthru", 00:32:50.613 "block_size": 4096, 00:32:50.613 "num_blocks": 8192, 00:32:50.613 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:32:50.613 "md_size": 32, 00:32:50.613 "md_interleave": false, 00:32:50.613 "dif_type": 0, 00:32:50.613 "assigned_rate_limits": { 00:32:50.613 "rw_ios_per_sec": 0, 00:32:50.613 "rw_mbytes_per_sec": 0, 00:32:50.613 "r_mbytes_per_sec": 0, 00:32:50.613 "w_mbytes_per_sec": 0 00:32:50.613 }, 00:32:50.613 "claimed": true, 00:32:50.614 "claim_type": "exclusive_write", 00:32:50.614 "zoned": false, 00:32:50.614 "supported_io_types": { 00:32:50.614 "read": true, 00:32:50.614 "write": true, 00:32:50.614 "unmap": true, 00:32:50.614 "flush": true, 00:32:50.614 "reset": true, 00:32:50.614 "nvme_admin": false, 00:32:50.614 "nvme_io": false, 00:32:50.614 "nvme_io_md": false, 00:32:50.614 "write_zeroes": true, 00:32:50.614 "zcopy": true, 00:32:50.614 "get_zone_info": false, 00:32:50.614 "zone_management": false, 00:32:50.614 "zone_append": false, 00:32:50.614 "compare": false, 00:32:50.614 "compare_and_write": false, 00:32:50.614 "abort": true, 00:32:50.614 "seek_hole": false, 00:32:50.614 "seek_data": false, 00:32:50.614 "copy": true, 00:32:50.614 "nvme_iov_md": false 00:32:50.614 }, 00:32:50.614 "memory_domains": [ 00:32:50.614 { 00:32:50.614 "dma_device_id": "system", 00:32:50.614 "dma_device_type": 1 00:32:50.614 }, 00:32:50.614 { 00:32:50.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:50.614 "dma_device_type": 2 00:32:50.614 } 00:32:50.614 ], 00:32:50.614 "driver_specific": { 00:32:50.614 "passthru": { 00:32:50.614 "name": "pt1", 00:32:50.614 "base_bdev_name": "malloc1" 00:32:50.614 } 00:32:50.614 } 00:32:50.614 }' 00:32:50.614 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:50.614 17:26:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:50.614 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:32:50.614 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:32:50.872 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:32:51.130 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:32:51.130 "name": "pt2", 00:32:51.130 "aliases": [ 00:32:51.130 "00000000-0000-0000-0000-000000000002" 00:32:51.130 ], 00:32:51.130 "product_name": "passthru", 00:32:51.130 "block_size": 4096, 00:32:51.130 "num_blocks": 8192, 00:32:51.130 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:51.130 "md_size": 32, 00:32:51.130 "md_interleave": false, 00:32:51.130 "dif_type": 0, 00:32:51.130 "assigned_rate_limits": { 00:32:51.130 "rw_ios_per_sec": 0, 00:32:51.130 "rw_mbytes_per_sec": 0, 00:32:51.130 "r_mbytes_per_sec": 0, 00:32:51.130 
"w_mbytes_per_sec": 0 00:32:51.130 }, 00:32:51.130 "claimed": true, 00:32:51.130 "claim_type": "exclusive_write", 00:32:51.130 "zoned": false, 00:32:51.130 "supported_io_types": { 00:32:51.130 "read": true, 00:32:51.130 "write": true, 00:32:51.130 "unmap": true, 00:32:51.130 "flush": true, 00:32:51.130 "reset": true, 00:32:51.130 "nvme_admin": false, 00:32:51.130 "nvme_io": false, 00:32:51.130 "nvme_io_md": false, 00:32:51.130 "write_zeroes": true, 00:32:51.130 "zcopy": true, 00:32:51.130 "get_zone_info": false, 00:32:51.130 "zone_management": false, 00:32:51.130 "zone_append": false, 00:32:51.130 "compare": false, 00:32:51.130 "compare_and_write": false, 00:32:51.130 "abort": true, 00:32:51.130 "seek_hole": false, 00:32:51.130 "seek_data": false, 00:32:51.130 "copy": true, 00:32:51.130 "nvme_iov_md": false 00:32:51.130 }, 00:32:51.130 "memory_domains": [ 00:32:51.130 { 00:32:51.130 "dma_device_id": "system", 00:32:51.130 "dma_device_type": 1 00:32:51.130 }, 00:32:51.130 { 00:32:51.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:51.130 "dma_device_type": 2 00:32:51.130 } 00:32:51.130 ], 00:32:51.130 "driver_specific": { 00:32:51.130 "passthru": { 00:32:51.130 "name": "pt2", 00:32:51.130 "base_bdev_name": "malloc2" 00:32:51.130 } 00:32:51.130 } 00:32:51.130 }' 00:32:51.130 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:51.130 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:32:51.389 17:26:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:51.389 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:32:51.648 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:32:51.648 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:51.648 17:26:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:32:51.648 [2024-07-23 17:26:47.057815] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' a766bf16-3d97-4b0c-af65-bd5e302f9744 '!=' a766bf16-3d97-4b0c-af65-bd5e302f9744 ']' 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:32:51.907 [2024-07-23 17:26:47.242077] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:51.907 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:52.166 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:52.166 "name": "raid_bdev1", 00:32:52.166 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:52.166 "strip_size_kb": 0, 00:32:52.166 "state": "online", 00:32:52.166 "raid_level": "raid1", 00:32:52.166 "superblock": true, 00:32:52.166 "num_base_bdevs": 2, 00:32:52.166 "num_base_bdevs_discovered": 1, 00:32:52.166 "num_base_bdevs_operational": 1, 00:32:52.166 
"base_bdevs_list": [ 00:32:52.166 { 00:32:52.166 "name": null, 00:32:52.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:52.166 "is_configured": false, 00:32:52.166 "data_offset": 256, 00:32:52.166 "data_size": 7936 00:32:52.166 }, 00:32:52.166 { 00:32:52.166 "name": "pt2", 00:32:52.166 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:52.166 "is_configured": true, 00:32:52.166 "data_offset": 256, 00:32:52.166 "data_size": 7936 00:32:52.166 } 00:32:52.166 ] 00:32:52.166 }' 00:32:52.166 17:26:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:52.166 17:26:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:52.732 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:52.991 [2024-07-23 17:26:48.284817] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:52.991 [2024-07-23 17:26:48.284844] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:52.991 [2024-07-23 17:26:48.284900] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:52.991 [2024-07-23 17:26:48.284945] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:52.991 [2024-07-23 17:26:48.284956] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17a1060 name raid_bdev1, state offline 00:32:52.991 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:32:52.991 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:53.250 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 
-- # raid_bdev= 00:32:53.250 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:32:53.250 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:32:53.250 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:32:53.250 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:32:53.508 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:32:53.508 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:32:53.508 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:32:53.508 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:32:53.508 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:32:53.509 17:26:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:32:53.768 [2024-07-23 17:26:49.030745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:32:53.768 [2024-07-23 17:26:49.030786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:53.768 [2024-07-23 17:26:49.030802] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17a35a0 00:32:53.768 [2024-07-23 17:26:49.030814] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:53.768 [2024-07-23 17:26:49.032227] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:53.768 
[2024-07-23 17:26:49.032254] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:32:53.768 [2024-07-23 17:26:49.032300] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:32:53.768 [2024-07-23 17:26:49.032325] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:53.768 [2024-07-23 17:26:49.032399] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17a18c0 00:32:53.768 [2024-07-23 17:26:49.032410] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:53.768 [2024-07-23 17:26:49.032463] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x179d9c0 00:32:53.768 [2024-07-23 17:26:49.032558] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17a18c0 00:32:53.768 [2024-07-23 17:26:49.032577] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17a18c0 00:32:53.768 [2024-07-23 17:26:49.032643] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:53.768 pt2 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:53.768 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:54.027 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:54.027 "name": "raid_bdev1", 00:32:54.027 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:54.027 "strip_size_kb": 0, 00:32:54.027 "state": "online", 00:32:54.027 "raid_level": "raid1", 00:32:54.027 "superblock": true, 00:32:54.027 "num_base_bdevs": 2, 00:32:54.027 "num_base_bdevs_discovered": 1, 00:32:54.027 "num_base_bdevs_operational": 1, 00:32:54.027 "base_bdevs_list": [ 00:32:54.027 { 00:32:54.027 "name": null, 00:32:54.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:54.027 "is_configured": false, 00:32:54.027 "data_offset": 256, 00:32:54.027 "data_size": 7936 00:32:54.027 }, 00:32:54.027 { 00:32:54.027 "name": "pt2", 00:32:54.027 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:54.027 "is_configured": true, 00:32:54.027 "data_offset": 256, 00:32:54.027 "data_size": 7936 00:32:54.027 } 00:32:54.027 ] 00:32:54.027 }' 00:32:54.027 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:54.027 17:26:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:54.594 17:26:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:32:54.852 [2024-07-23 17:26:50.121671] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:54.852 [2024-07-23 17:26:50.121704] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:32:54.852 [2024-07-23 17:26:50.121753] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:54.852 [2024-07-23 17:26:50.121796] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:54.852 [2024-07-23 17:26:50.121807] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17a18c0 name raid_bdev1, state offline 00:32:54.852 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:54.852 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:32:55.111 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:32:55.111 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:32:55.111 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:32:55.111 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:32:55.370 [2024-07-23 17:26:50.626995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:32:55.370 [2024-07-23 17:26:50.627040] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:55.370 [2024-07-23 17:26:50.627058] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179fe50 00:32:55.370 [2024-07-23 17:26:50.627070] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:55.370 [2024-07-23 17:26:50.628484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:55.370 [2024-07-23 17:26:50.628511] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:32:55.370 [2024-07-23 17:26:50.628556] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:32:55.370 [2024-07-23 17:26:50.628581] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:32:55.370 [2024-07-23 17:26:50.628670] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:32:55.370 [2024-07-23 17:26:50.628683] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:32:55.370 [2024-07-23 17:26:50.628698] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17a23b0 name raid_bdev1, state configuring 00:32:55.370 [2024-07-23 17:26:50.628720] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:32:55.370 [2024-07-23 17:26:50.628772] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17a1560 00:32:55.370 [2024-07-23 17:26:50.628782] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:55.370 [2024-07-23 17:26:50.628835] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a0c90 00:32:55.370 [2024-07-23 17:26:50.628942] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17a1560 00:32:55.370 [2024-07-23 17:26:50.628953] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17a1560 00:32:55.370 [2024-07-23 17:26:50.629022] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:32:55.370 pt1 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:55.370 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:32:55.629 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:32:55.629 "name": "raid_bdev1", 00:32:55.629 "uuid": "a766bf16-3d97-4b0c-af65-bd5e302f9744", 00:32:55.629 "strip_size_kb": 0, 00:32:55.629 "state": "online", 00:32:55.629 "raid_level": 
"raid1", 00:32:55.629 "superblock": true, 00:32:55.629 "num_base_bdevs": 2, 00:32:55.629 "num_base_bdevs_discovered": 1, 00:32:55.629 "num_base_bdevs_operational": 1, 00:32:55.629 "base_bdevs_list": [ 00:32:55.629 { 00:32:55.629 "name": null, 00:32:55.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:32:55.629 "is_configured": false, 00:32:55.629 "data_offset": 256, 00:32:55.629 "data_size": 7936 00:32:55.629 }, 00:32:55.629 { 00:32:55.629 "name": "pt2", 00:32:55.629 "uuid": "00000000-0000-0000-0000-000000000002", 00:32:55.629 "is_configured": true, 00:32:55.629 "data_offset": 256, 00:32:55.629 "data_size": 7936 00:32:55.629 } 00:32:55.629 ] 00:32:55.629 }' 00:32:55.629 17:26:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:32:55.629 17:26:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:56.196 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:32:56.196 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:32:56.455 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:32:56.455 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:32:56.455 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:32:56.714 [2024-07-23 17:26:51.878553] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:32:56.714 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' a766bf16-3d97-4b0c-af65-bd5e302f9744 '!=' a766bf16-3d97-4b0c-af65-bd5e302f9744 ']' 
00:32:56.714 17:26:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 69134 00:32:56.714 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 69134 ']' 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 69134 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 69134 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 69134' 00:32:56.715 killing process with pid 69134 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 69134 00:32:56.715 [2024-07-23 17:26:51.949825] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:32:56.715 [2024-07-23 17:26:51.949876] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:32:56.715 [2024-07-23 17:26:51.949926] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:32:56.715 [2024-07-23 17:26:51.949938] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17a1560 name raid_bdev1, state offline 00:32:56.715 17:26:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 69134 00:32:56.715 [2024-07-23 17:26:51.975445] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: 
raid_bdev_exit 00:32:56.974 17:26:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:32:56.974 00:32:56.974 real 0m18.126s 00:32:56.974 user 0m33.078s 00:32:56.974 sys 0m3.211s 00:32:56.974 17:26:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:56.974 17:26:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:56.974 ************************************ 00:32:56.974 END TEST raid_superblock_test_md_separate 00:32:56.974 ************************************ 00:32:56.974 17:26:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:32:56.974 17:26:52 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:32:56.974 17:26:52 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:32:56.974 17:26:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:56.974 17:26:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:56.974 17:26:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:32:56.974 ************************************ 00:32:56.974 START TEST raid_rebuild_test_sb_md_separate 00:32:56.974 ************************************ 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@572 -- # local verify=true 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:32:56.974 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 
00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=71738 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 71738 /var/tmp/spdk-raid.sock 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 71738 ']' 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:32:56.975 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:56.975 17:26:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:32:57.235 [2024-07-23 17:26:52.396886] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:32:57.235 [2024-07-23 17:26:52.397037] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71738 ] 00:32:57.235 I/O size of 3145728 is greater than zero copy threshold (65536). 00:32:57.235 Zero copy mechanism will not be used. 00:32:57.235 [2024-07-23 17:26:52.597843] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:57.235 [2024-07-23 17:26:52.647843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:57.494 [2024-07-23 17:26:52.704039] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:57.494 [2024-07-23 17:26:52.704067] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:32:58.062 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:58.062 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:32:58.062 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:32:58.062 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:32:58.320 BaseBdev1_malloc 00:32:58.320 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:32:58.579 [2024-07-23 17:26:53.760829] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:32:58.579 [2024-07-23 17:26:53.760876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:58.579 [2024-07-23 
17:26:53.760910] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28bf6a0 00:32:58.579 [2024-07-23 17:26:53.760923] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:58.579 [2024-07-23 17:26:53.762486] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:58.579 [2024-07-23 17:26:53.762513] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:32:58.579 BaseBdev1 00:32:58.579 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:32:58.579 17:26:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:32:58.838 BaseBdev2_malloc 00:32:58.838 17:26:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:32:58.838 [2024-07-23 17:26:54.248575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:32:58.838 [2024-07-23 17:26:54.248621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:58.838 [2024-07-23 17:26:54.248641] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b5850 00:32:58.838 [2024-07-23 17:26:54.248654] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:58.838 [2024-07-23 17:26:54.250017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:58.838 [2024-07-23 17:26:54.250044] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:32:58.838 BaseBdev2 00:32:59.097 17:26:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:32:59.097 spare_malloc 00:32:59.355 17:26:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:32:59.355 spare_delay 00:32:59.355 17:26:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:32:59.614 [2024-07-23 17:26:54.967735] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:32:59.614 [2024-07-23 17:26:54.967779] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:59.614 [2024-07-23 17:26:54.967801] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28ba230 00:32:59.614 [2024-07-23 17:26:54.967813] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:59.614 [2024-07-23 17:26:54.969257] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:59.614 [2024-07-23 17:26:54.969285] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:32:59.614 spare 00:32:59.614 17:26:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:32:59.872 [2024-07-23 17:26:55.212419] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:32:59.872 [2024-07-23 17:26:55.213746] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:32:59.872 [2024-07-23 17:26:55.213910] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28b9970 00:32:59.872 [2024-07-23 17:26:55.213924] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:32:59.872 [2024-07-23 17:26:55.214002] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2721e60 00:32:59.873 [2024-07-23 17:26:55.214115] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28b9970 00:32:59.873 [2024-07-23 17:26:55.214125] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28b9970 00:32:59.873 [2024-07-23 17:26:55.214194] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:32:59.873 17:26:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:32:59.873 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:00.131 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:00.131 "name": "raid_bdev1", 00:33:00.131 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:00.131 "strip_size_kb": 0, 00:33:00.131 "state": "online", 00:33:00.131 "raid_level": "raid1", 00:33:00.131 "superblock": true, 00:33:00.131 "num_base_bdevs": 2, 00:33:00.131 "num_base_bdevs_discovered": 2, 00:33:00.131 "num_base_bdevs_operational": 2, 00:33:00.131 "base_bdevs_list": [ 00:33:00.131 { 00:33:00.131 "name": "BaseBdev1", 00:33:00.131 "uuid": "b42f9c0b-b3a3-5b39-9a18-67b65299f621", 00:33:00.131 "is_configured": true, 00:33:00.131 "data_offset": 256, 00:33:00.131 "data_size": 7936 00:33:00.131 }, 00:33:00.131 { 00:33:00.131 "name": "BaseBdev2", 00:33:00.131 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:00.131 "is_configured": true, 00:33:00.131 "data_offset": 256, 00:33:00.131 "data_size": 7936 00:33:00.131 } 00:33:00.131 ] 00:33:00.131 }' 00:33:00.131 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:00.131 17:26:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:00.699 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:00.699 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:33:00.958 [2024-07-23 17:26:56.279461] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:00.958 
17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:33:00.959 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:00.959 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:01.217 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:33:01.217 17:26:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:33:01.476 [2024-07-23 17:26:56.780597] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28bdb70 00:33:01.476 /dev/nbd0 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:01.476 1+0 records in 00:33:01.476 1+0 records out 00:33:01.476 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244013 s, 16.8 MB/s 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:33:01.476 17:26:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:33:02.412 7936+0 records in 00:33:02.412 7936+0 records out 00:33:02.412 32505856 bytes (33 MB, 31 MiB) copied, 0.75662 s, 43.0 MB/s 00:33:02.412 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:33:02.412 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:33:02.412 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:02.412 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:02.412 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:33:02.412 17:26:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:02.412 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:33:02.670 [2024-07-23 17:26:57.868081] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:33:02.670 17:26:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:33:02.929 [2024-07-23 17:26:58.100739] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:02.929 "name": "raid_bdev1", 00:33:02.929 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:02.929 "strip_size_kb": 0, 00:33:02.929 "state": "online", 00:33:02.929 "raid_level": "raid1", 00:33:02.929 "superblock": true, 00:33:02.929 "num_base_bdevs": 2, 00:33:02.929 "num_base_bdevs_discovered": 1, 00:33:02.929 "num_base_bdevs_operational": 1, 00:33:02.929 "base_bdevs_list": [ 00:33:02.929 { 00:33:02.929 "name": null, 00:33:02.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:02.929 "is_configured": false, 00:33:02.929 "data_offset": 256, 00:33:02.929 "data_size": 7936 00:33:02.929 }, 00:33:02.929 { 00:33:02.929 "name": "BaseBdev2", 
00:33:02.929 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:02.929 "is_configured": true, 00:33:02.929 "data_offset": 256, 00:33:02.929 "data_size": 7936 00:33:02.929 } 00:33:02.929 ] 00:33:02.929 }' 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:02.929 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:03.498 17:26:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:33:03.757 [2024-07-23 17:26:59.063369] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:33:03.757 [2024-07-23 17:26:59.065643] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2722450 00:33:03.757 [2024-07-23 17:26:59.067821] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:33:03.757 17:26:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:33:04.692 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:33:04.692 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:04.692 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:33:04.692 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:33:04.692 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:04.692 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:04.692 17:27:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:04.951 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:04.951 "name": "raid_bdev1", 00:33:04.951 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:04.951 "strip_size_kb": 0, 00:33:04.951 "state": "online", 00:33:04.951 "raid_level": "raid1", 00:33:04.951 "superblock": true, 00:33:04.951 "num_base_bdevs": 2, 00:33:04.951 "num_base_bdevs_discovered": 2, 00:33:04.951 "num_base_bdevs_operational": 2, 00:33:04.951 "process": { 00:33:04.951 "type": "rebuild", 00:33:04.951 "target": "spare", 00:33:04.951 "progress": { 00:33:04.951 "blocks": 2816, 00:33:04.951 "percent": 35 00:33:04.951 } 00:33:04.951 }, 00:33:04.951 "base_bdevs_list": [ 00:33:04.951 { 00:33:04.951 "name": "spare", 00:33:04.951 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:04.951 "is_configured": true, 00:33:04.951 "data_offset": 256, 00:33:04.951 "data_size": 7936 00:33:04.951 }, 00:33:04.951 { 00:33:04.951 "name": "BaseBdev2", 00:33:04.951 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:04.951 "is_configured": true, 00:33:04.951 "data_offset": 256, 00:33:04.951 "data_size": 7936 00:33:04.951 } 00:33:04.951 ] 00:33:04.951 }' 00:33:04.951 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:04.951 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:33:04.951 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:05.210 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:33:05.210 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:33:05.210 [2024-07-23 17:27:00.604628] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:05.469 [2024-07-23 17:27:00.680614] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:33:05.469 [2024-07-23 17:27:00.680660] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:05.469 [2024-07-23 17:27:00.680675] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:05.469 [2024-07-23 17:27:00.680684] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:05.469 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:05.728 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:05.728 "name": "raid_bdev1", 00:33:05.728 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:05.728 "strip_size_kb": 0, 00:33:05.728 "state": "online", 00:33:05.728 "raid_level": "raid1", 00:33:05.728 "superblock": true, 00:33:05.728 "num_base_bdevs": 2, 00:33:05.728 "num_base_bdevs_discovered": 1, 00:33:05.728 "num_base_bdevs_operational": 1, 00:33:05.728 "base_bdevs_list": [ 00:33:05.728 { 00:33:05.728 "name": null, 00:33:05.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:05.728 "is_configured": false, 00:33:05.728 "data_offset": 256, 00:33:05.728 "data_size": 7936 00:33:05.728 }, 00:33:05.728 { 00:33:05.728 "name": "BaseBdev2", 00:33:05.728 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:05.728 "is_configured": true, 00:33:05.728 "data_offset": 256, 00:33:05.728 "data_size": 7936 00:33:05.728 } 00:33:05.728 ] 00:33:05.728 }' 00:33:05.728 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:05.728 17:27:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:06.295 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:33:06.295 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:06.295 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:33:06.295 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:33:06.295 17:27:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:06.295 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:06.295 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:06.554 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:06.554 "name": "raid_bdev1", 00:33:06.554 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:06.554 "strip_size_kb": 0, 00:33:06.554 "state": "online", 00:33:06.554 "raid_level": "raid1", 00:33:06.554 "superblock": true, 00:33:06.554 "num_base_bdevs": 2, 00:33:06.554 "num_base_bdevs_discovered": 1, 00:33:06.554 "num_base_bdevs_operational": 1, 00:33:06.554 "base_bdevs_list": [ 00:33:06.554 { 00:33:06.554 "name": null, 00:33:06.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:06.554 "is_configured": false, 00:33:06.554 "data_offset": 256, 00:33:06.554 "data_size": 7936 00:33:06.554 }, 00:33:06.554 { 00:33:06.554 "name": "BaseBdev2", 00:33:06.554 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:06.554 "is_configured": true, 00:33:06.554 "data_offset": 256, 00:33:06.554 "data_size": 7936 00:33:06.554 } 00:33:06.554 ] 00:33:06.554 }' 00:33:06.554 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:06.554 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:33:06.554 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:06.554 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:33:06.554 17:27:01 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:33:07.120 [2024-07-23 17:27:02.432575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:33:07.121 [2024-07-23 17:27:02.434810] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x282ce60 00:33:07.121 [2024-07-23 17:27:02.436247] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:33:07.121 17:27:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:33:08.071 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:33:08.072 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:08.072 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:33:08.072 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:33:08.072 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:08.072 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:08.072 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:08.641 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:08.641 "name": "raid_bdev1", 00:33:08.641 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:08.641 "strip_size_kb": 0, 00:33:08.641 "state": "online", 00:33:08.641 "raid_level": "raid1", 00:33:08.641 "superblock": true, 00:33:08.641 "num_base_bdevs": 2, 
00:33:08.641 "num_base_bdevs_discovered": 2, 00:33:08.641 "num_base_bdevs_operational": 2, 00:33:08.641 "process": { 00:33:08.641 "type": "rebuild", 00:33:08.641 "target": "spare", 00:33:08.641 "progress": { 00:33:08.641 "blocks": 3840, 00:33:08.641 "percent": 48 00:33:08.641 } 00:33:08.641 }, 00:33:08.641 "base_bdevs_list": [ 00:33:08.641 { 00:33:08.641 "name": "spare", 00:33:08.641 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:08.641 "is_configured": true, 00:33:08.641 "data_offset": 256, 00:33:08.641 "data_size": 7936 00:33:08.641 }, 00:33:08.641 { 00:33:08.641 "name": "BaseBdev2", 00:33:08.641 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:08.641 "is_configured": true, 00:33:08.641 "data_offset": 256, 00:33:08.641 "data_size": 7936 00:33:08.641 } 00:33:08.641 ] 00:33:08.641 }' 00:33:08.641 17:27:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:08.641 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:33:08.641 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:33:08.899 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1124 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:08.899 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:09.157 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:09.157 "name": "raid_bdev1", 00:33:09.157 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:09.157 "strip_size_kb": 0, 00:33:09.157 "state": "online", 00:33:09.157 "raid_level": "raid1", 00:33:09.157 "superblock": true, 00:33:09.157 "num_base_bdevs": 2, 00:33:09.157 "num_base_bdevs_discovered": 2, 00:33:09.157 "num_base_bdevs_operational": 2, 00:33:09.157 "process": { 00:33:09.157 "type": "rebuild", 00:33:09.157 "target": "spare", 00:33:09.157 "progress": { 00:33:09.157 "blocks": 4864, 00:33:09.157 "percent": 61 00:33:09.157 } 00:33:09.157 }, 00:33:09.157 "base_bdevs_list": [ 00:33:09.157 { 00:33:09.157 "name": "spare", 00:33:09.157 "uuid": 
"e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:09.157 "is_configured": true, 00:33:09.157 "data_offset": 256, 00:33:09.157 "data_size": 7936 00:33:09.157 }, 00:33:09.157 { 00:33:09.157 "name": "BaseBdev2", 00:33:09.157 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:09.157 "is_configured": true, 00:33:09.157 "data_offset": 256, 00:33:09.157 "data_size": 7936 00:33:09.157 } 00:33:09.157 ] 00:33:09.157 }' 00:33:09.157 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:09.157 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:33:09.157 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:09.157 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:33:09.157 17:27:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:10.533 17:27:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:10.533 [2024-07-23 17:27:05.560634] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:33:10.533 [2024-07-23 17:27:05.560690] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:33:10.533 [2024-07-23 17:27:05.560774] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:10.533 "name": "raid_bdev1", 00:33:10.533 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:10.533 "strip_size_kb": 0, 00:33:10.533 "state": "online", 00:33:10.533 "raid_level": "raid1", 00:33:10.533 "superblock": true, 00:33:10.533 "num_base_bdevs": 2, 00:33:10.533 "num_base_bdevs_discovered": 2, 00:33:10.533 "num_base_bdevs_operational": 2, 00:33:10.533 "base_bdevs_list": [ 00:33:10.533 { 00:33:10.533 "name": "spare", 00:33:10.533 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:10.533 "is_configured": true, 00:33:10.533 "data_offset": 256, 00:33:10.533 "data_size": 7936 00:33:10.533 }, 00:33:10.533 { 00:33:10.533 "name": "BaseBdev2", 00:33:10.533 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:10.533 "is_configured": true, 00:33:10.533 "data_offset": 256, 00:33:10.533 "data_size": 7936 00:33:10.533 } 00:33:10.533 ] 00:33:10.533 }' 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:33:10.533 
17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:10.533 17:27:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:10.792 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:10.792 "name": "raid_bdev1", 00:33:10.792 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:10.792 "strip_size_kb": 0, 00:33:10.792 "state": "online", 00:33:10.792 "raid_level": "raid1", 00:33:10.792 "superblock": true, 00:33:10.792 "num_base_bdevs": 2, 00:33:10.792 "num_base_bdevs_discovered": 2, 00:33:10.792 "num_base_bdevs_operational": 2, 00:33:10.792 "base_bdevs_list": [ 00:33:10.792 { 00:33:10.792 "name": "spare", 00:33:10.792 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:10.792 "is_configured": true, 00:33:10.792 "data_offset": 256, 00:33:10.792 "data_size": 7936 00:33:10.792 }, 00:33:10.792 { 00:33:10.792 "name": "BaseBdev2", 00:33:10.792 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:10.792 "is_configured": true, 00:33:10.792 "data_offset": 256, 00:33:10.792 "data_size": 7936 00:33:10.792 
} 00:33:10.792 ] 00:33:10.792 }' 00:33:10.792 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:10.792 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:33:10.792 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:11.051 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:11.052 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:11.052 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:11.052 17:27:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:11.618 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:11.618 "name": "raid_bdev1", 00:33:11.618 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:11.618 "strip_size_kb": 0, 00:33:11.618 "state": "online", 00:33:11.618 "raid_level": "raid1", 00:33:11.618 "superblock": true, 00:33:11.618 "num_base_bdevs": 2, 00:33:11.618 "num_base_bdevs_discovered": 2, 00:33:11.618 "num_base_bdevs_operational": 2, 00:33:11.618 "base_bdevs_list": [ 00:33:11.618 { 00:33:11.618 "name": "spare", 00:33:11.618 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:11.618 "is_configured": true, 00:33:11.618 "data_offset": 256, 00:33:11.618 "data_size": 7936 00:33:11.618 }, 00:33:11.618 { 00:33:11.618 "name": "BaseBdev2", 00:33:11.618 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:11.618 "is_configured": true, 00:33:11.618 "data_offset": 256, 00:33:11.618 "data_size": 7936 00:33:11.618 } 00:33:11.618 ] 00:33:11.618 }' 00:33:11.618 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:11.618 17:27:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:12.185 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:12.185 [2024-07-23 17:27:07.601535] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:12.185 [2024-07-23 17:27:07.601562] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:12.185 [2024-07-23 17:27:07.601617] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:12.185 [2024-07-23 17:27:07.601672] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:33:12.185 [2024-07-23 17:27:07.601683] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28b9970 name raid_bdev1, state offline 00:33:12.443 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:12.443 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:12.701 17:27:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:12.701 17:27:07 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:33:12.960 /dev/nbd0 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:12.960 1+0 records in 00:33:12.960 1+0 records out 00:33:12.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00015196 s, 27.0 MB/s 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@884 -- # size=4096 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:12.960 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:33:13.219 /dev/nbd1 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:33:13.219 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:13.219 17:27:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:13.220 1+0 records in 00:33:13.220 1+0 records out 00:33:13.220 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331464 s, 12.4 MB/s 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@50 -- # local nbd_list 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:13.220 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:33:13.478 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:13.479 17:27:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:33:13.737 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:33:13.996 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:33:14.255 [2024-07-23 17:27:09.423252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:33:14.255 [2024-07-23 17:27:09.423295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:14.255 [2024-07-23 17:27:09.423317] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2722250 00:33:14.255 [2024-07-23 17:27:09.423329] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:14.255 [2024-07-23 17:27:09.424795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:14.256 [2024-07-23 17:27:09.424821] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:33:14.256 [2024-07-23 17:27:09.424875] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
spare 00:33:14.256 [2024-07-23 17:27:09.424909] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:33:14.256 [2024-07-23 17:27:09.425001] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:14.256 spare 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:14.256 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:14.256 [2024-07-23 17:27:09.525309] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28bbbb0 00:33:14.256 [2024-07-23 
17:27:09.525327] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:33:14.256 [2024-07-23 17:27:09.525391] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28bb3a0 00:33:14.256 [2024-07-23 17:27:09.525510] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28bbbb0 00:33:14.256 [2024-07-23 17:27:09.525520] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28bbbb0 00:33:14.256 [2024-07-23 17:27:09.525592] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:14.514 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:14.514 "name": "raid_bdev1", 00:33:14.514 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:14.514 "strip_size_kb": 0, 00:33:14.514 "state": "online", 00:33:14.514 "raid_level": "raid1", 00:33:14.514 "superblock": true, 00:33:14.514 "num_base_bdevs": 2, 00:33:14.514 "num_base_bdevs_discovered": 2, 00:33:14.514 "num_base_bdevs_operational": 2, 00:33:14.514 "base_bdevs_list": [ 00:33:14.514 { 00:33:14.514 "name": "spare", 00:33:14.514 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:14.514 "is_configured": true, 00:33:14.514 "data_offset": 256, 00:33:14.514 "data_size": 7936 00:33:14.514 }, 00:33:14.514 { 00:33:14.514 "name": "BaseBdev2", 00:33:14.514 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:14.514 "is_configured": true, 00:33:14.514 "data_offset": 256, 00:33:14.515 "data_size": 7936 00:33:14.515 } 00:33:14.515 ] 00:33:14.515 }' 00:33:14.515 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:14.515 17:27:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:33:15.082 17:27:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:15.082 "name": "raid_bdev1", 00:33:15.082 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:15.082 "strip_size_kb": 0, 00:33:15.082 "state": "online", 00:33:15.082 "raid_level": "raid1", 00:33:15.082 "superblock": true, 00:33:15.082 "num_base_bdevs": 2, 00:33:15.082 "num_base_bdevs_discovered": 2, 00:33:15.082 "num_base_bdevs_operational": 2, 00:33:15.082 "base_bdevs_list": [ 00:33:15.082 { 00:33:15.082 "name": "spare", 00:33:15.082 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:15.082 "is_configured": true, 00:33:15.082 "data_offset": 256, 00:33:15.082 "data_size": 7936 00:33:15.082 }, 00:33:15.082 { 00:33:15.082 "name": "BaseBdev2", 00:33:15.082 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:15.082 "is_configured": true, 00:33:15.082 "data_offset": 256, 00:33:15.082 "data_size": 7936 00:33:15.082 } 00:33:15.082 ] 00:33:15.082 }' 00:33:15.082 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 
-- # [[ none == \n\o\n\e ]] 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:33:15.340 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:33:15.600 [2024-07-23 17:27:10.903289] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:15.600 17:27:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:15.860 17:27:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:15.860 "name": "raid_bdev1", 00:33:15.860 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:15.860 "strip_size_kb": 0, 00:33:15.860 "state": "online", 00:33:15.860 "raid_level": "raid1", 00:33:15.860 "superblock": true, 00:33:15.860 "num_base_bdevs": 2, 00:33:15.860 "num_base_bdevs_discovered": 1, 00:33:15.860 "num_base_bdevs_operational": 1, 00:33:15.860 "base_bdevs_list": [ 00:33:15.860 { 00:33:15.860 "name": null, 00:33:15.860 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:15.860 "is_configured": false, 00:33:15.860 "data_offset": 256, 00:33:15.860 "data_size": 7936 00:33:15.860 }, 00:33:15.860 { 00:33:15.860 "name": "BaseBdev2", 00:33:15.860 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:15.860 "is_configured": true, 00:33:15.860 "data_offset": 256, 00:33:15.860 "data_size": 7936 00:33:15.860 } 00:33:15.860 ] 00:33:15.860 }' 00:33:15.860 17:27:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:15.860 17:27:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:16.427 17:27:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:33:16.685 [2024-07-23 17:27:11.990185] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:33:16.685 [2024-07-23 17:27:11.990330] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:33:16.685 [2024-07-23 17:27:11.990347] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:33:16.685 [2024-07-23 17:27:11.990374] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:33:16.685 [2024-07-23 17:27:11.992487] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27238e0 00:33:16.685 [2024-07-23 17:27:11.994822] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:33:16.685 17:27:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:33:17.620 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:33:17.620 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:17.620 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:33:17.620 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:33:17.620 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:17.621 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:17.621 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:17.879 17:27:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:17.879 "name": "raid_bdev1", 00:33:17.879 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:17.879 "strip_size_kb": 0, 00:33:17.879 "state": "online", 00:33:17.879 "raid_level": "raid1", 00:33:17.879 "superblock": true, 00:33:17.879 "num_base_bdevs": 2, 00:33:17.879 "num_base_bdevs_discovered": 2, 00:33:17.879 "num_base_bdevs_operational": 2, 00:33:17.879 "process": { 00:33:17.879 "type": "rebuild", 00:33:17.879 "target": "spare", 00:33:17.879 "progress": { 00:33:17.879 "blocks": 2816, 00:33:17.879 "percent": 35 00:33:17.879 } 00:33:17.879 }, 00:33:17.879 "base_bdevs_list": [ 00:33:17.879 { 00:33:17.879 "name": "spare", 00:33:17.879 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:17.879 "is_configured": true, 00:33:17.879 "data_offset": 256, 00:33:17.879 "data_size": 7936 00:33:17.879 }, 00:33:17.879 { 00:33:17.879 "name": "BaseBdev2", 00:33:17.879 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:17.879 "is_configured": true, 00:33:17.879 "data_offset": 256, 00:33:17.879 "data_size": 7936 00:33:17.879 } 00:33:17.879 ] 00:33:17.879 }' 00:33:17.879 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:17.879 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:33:17.879 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:17.879 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:33:17.879 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:33:18.137 [2024-07-23 17:27:13.523595] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:18.394 
[2024-07-23 17:27:13.607419] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:33:18.394 [2024-07-23 17:27:13.607464] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:18.394 [2024-07-23 17:27:13.607479] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:18.394 [2024-07-23 17:27:13.607488] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:18.394 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:18.653 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:18.653 "name": "raid_bdev1", 00:33:18.653 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:18.653 "strip_size_kb": 0, 00:33:18.653 "state": "online", 00:33:18.653 "raid_level": "raid1", 00:33:18.653 "superblock": true, 00:33:18.653 "num_base_bdevs": 2, 00:33:18.653 "num_base_bdevs_discovered": 1, 00:33:18.653 "num_base_bdevs_operational": 1, 00:33:18.653 "base_bdevs_list": [ 00:33:18.653 { 00:33:18.653 "name": null, 00:33:18.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:18.653 "is_configured": false, 00:33:18.653 "data_offset": 256, 00:33:18.653 "data_size": 7936 00:33:18.653 }, 00:33:18.653 { 00:33:18.653 "name": "BaseBdev2", 00:33:18.653 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:18.653 "is_configured": true, 00:33:18.653 "data_offset": 256, 00:33:18.653 "data_size": 7936 00:33:18.653 } 00:33:18.653 ] 00:33:18.653 }' 00:33:18.653 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:18.653 17:27:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:19.221 17:27:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:33:19.480 [2024-07-23 17:27:14.693415] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:33:19.480 [2024-07-23 17:27:14.693463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:19.480 [2024-07-23 17:27:14.693485] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b9d70 00:33:19.480 [2024-07-23 17:27:14.693499] vbdev_passthru.c: 696:vbdev_passthru_register: 
*NOTICE*: bdev claimed 00:33:19.480 [2024-07-23 17:27:14.693710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:19.480 [2024-07-23 17:27:14.693726] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:33:19.480 [2024-07-23 17:27:14.693782] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:33:19.480 [2024-07-23 17:27:14.693793] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:33:19.480 [2024-07-23 17:27:14.693804] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:33:19.480 [2024-07-23 17:27:14.693821] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:33:19.480 [2024-07-23 17:27:14.695975] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27238e0 00:33:19.480 [2024-07-23 17:27:14.697403] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:33:19.480 spare 00:33:19.480 17:27:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:20.416 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:20.675 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:20.675 "name": "raid_bdev1", 00:33:20.675 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:20.675 "strip_size_kb": 0, 00:33:20.675 "state": "online", 00:33:20.675 "raid_level": "raid1", 00:33:20.675 "superblock": true, 00:33:20.675 "num_base_bdevs": 2, 00:33:20.675 "num_base_bdevs_discovered": 2, 00:33:20.675 "num_base_bdevs_operational": 2, 00:33:20.675 "process": { 00:33:20.675 "type": "rebuild", 00:33:20.675 "target": "spare", 00:33:20.675 "progress": { 00:33:20.675 "blocks": 3072, 00:33:20.675 "percent": 38 00:33:20.675 } 00:33:20.675 }, 00:33:20.675 "base_bdevs_list": [ 00:33:20.675 { 00:33:20.675 "name": "spare", 00:33:20.675 "uuid": "e7936783-8cfc-5d1c-966a-cd2e5ead7e55", 00:33:20.675 "is_configured": true, 00:33:20.675 "data_offset": 256, 00:33:20.675 "data_size": 7936 00:33:20.675 }, 00:33:20.675 { 00:33:20.675 "name": "BaseBdev2", 00:33:20.675 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:20.675 "is_configured": true, 00:33:20.675 "data_offset": 256, 00:33:20.675 "data_size": 7936 00:33:20.675 } 00:33:20.675 ] 00:33:20.675 }' 00:33:20.675 17:27:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:20.675 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:33:20.675 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:20.675 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:33:20.675 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:33:20.934 [2024-07-23 17:27:16.290529] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:20.934 [2024-07-23 17:27:16.309654] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:33:20.934 [2024-07-23 17:27:16.309695] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:20.934 [2024-07-23 17:27:16.309711] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:33:20.934 [2024-07-23 17:27:16.309718] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:20.934 17:27:16 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:20.934 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:21.194 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:21.194 "name": "raid_bdev1", 00:33:21.194 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:21.194 "strip_size_kb": 0, 00:33:21.194 "state": "online", 00:33:21.194 "raid_level": "raid1", 00:33:21.194 "superblock": true, 00:33:21.194 "num_base_bdevs": 2, 00:33:21.194 "num_base_bdevs_discovered": 1, 00:33:21.194 "num_base_bdevs_operational": 1, 00:33:21.194 "base_bdevs_list": [ 00:33:21.194 { 00:33:21.194 "name": null, 00:33:21.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:21.194 "is_configured": false, 00:33:21.194 "data_offset": 256, 00:33:21.194 "data_size": 7936 00:33:21.194 }, 00:33:21.194 { 00:33:21.194 "name": "BaseBdev2", 00:33:21.194 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:21.194 "is_configured": true, 00:33:21.194 "data_offset": 256, 00:33:21.194 "data_size": 7936 00:33:21.194 } 00:33:21.194 ] 00:33:21.194 }' 00:33:21.194 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:21.194 17:27:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 
-- # local target=none 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:22.130 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:22.389 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:22.389 "name": "raid_bdev1", 00:33:22.389 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:22.389 "strip_size_kb": 0, 00:33:22.389 "state": "online", 00:33:22.389 "raid_level": "raid1", 00:33:22.389 "superblock": true, 00:33:22.389 "num_base_bdevs": 2, 00:33:22.389 "num_base_bdevs_discovered": 1, 00:33:22.389 "num_base_bdevs_operational": 1, 00:33:22.389 "base_bdevs_list": [ 00:33:22.389 { 00:33:22.389 "name": null, 00:33:22.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:22.389 "is_configured": false, 00:33:22.389 "data_offset": 256, 00:33:22.389 "data_size": 7936 00:33:22.389 }, 00:33:22.389 { 00:33:22.389 "name": "BaseBdev2", 00:33:22.389 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:22.389 "is_configured": true, 00:33:22.389 "data_offset": 256, 00:33:22.389 "data_size": 7936 00:33:22.389 } 00:33:22.389 ] 00:33:22.389 }' 00:33:22.389 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:22.389 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:33:22.389 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:22.389 17:27:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:33:22.389 17:27:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:33:22.957 17:27:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:33:23.216 [2024-07-23 17:27:18.534312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:33:23.216 [2024-07-23 17:27:18.534360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:23.216 [2024-07-23 17:27:18.534381] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28bf8d0 00:33:23.216 [2024-07-23 17:27:18.534394] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:23.216 [2024-07-23 17:27:18.534582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:23.216 [2024-07-23 17:27:18.534598] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:33:23.216 [2024-07-23 17:27:18.534642] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:33:23.216 [2024-07-23 17:27:18.534654] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:33:23.216 [2024-07-23 17:27:18.534664] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:33:23.216 BaseBdev1 00:33:23.216 17:27:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:24.152 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:24.420 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:24.420 "name": "raid_bdev1", 00:33:24.420 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:24.420 "strip_size_kb": 0, 00:33:24.420 "state": "online", 00:33:24.420 "raid_level": "raid1", 00:33:24.420 "superblock": true, 00:33:24.420 "num_base_bdevs": 2, 00:33:24.420 "num_base_bdevs_discovered": 1, 00:33:24.420 "num_base_bdevs_operational": 1, 00:33:24.420 "base_bdevs_list": [ 00:33:24.420 { 00:33:24.420 "name": null, 00:33:24.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:24.420 "is_configured": false, 00:33:24.420 "data_offset": 256, 
00:33:24.420 "data_size": 7936 00:33:24.420 }, 00:33:24.420 { 00:33:24.420 "name": "BaseBdev2", 00:33:24.420 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:24.420 "is_configured": true, 00:33:24.420 "data_offset": 256, 00:33:24.420 "data_size": 7936 00:33:24.420 } 00:33:24.420 ] 00:33:24.420 }' 00:33:24.420 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:24.420 17:27:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:25.061 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:25.320 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:25.320 "name": "raid_bdev1", 00:33:25.320 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:25.320 "strip_size_kb": 0, 00:33:25.320 "state": "online", 00:33:25.320 "raid_level": "raid1", 00:33:25.320 "superblock": true, 00:33:25.320 "num_base_bdevs": 2, 00:33:25.320 "num_base_bdevs_discovered": 1, 00:33:25.320 "num_base_bdevs_operational": 1, 00:33:25.320 "base_bdevs_list": [ 
00:33:25.320 { 00:33:25.320 "name": null, 00:33:25.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:25.320 "is_configured": false, 00:33:25.321 "data_offset": 256, 00:33:25.321 "data_size": 7936 00:33:25.321 }, 00:33:25.321 { 00:33:25.321 "name": "BaseBdev2", 00:33:25.321 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:25.321 "is_configured": true, 00:33:25.321 "data_offset": 256, 00:33:25.321 "data_size": 7936 00:33:25.321 } 00:33:25.321 ] 00:33:25.321 }' 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:33:25.321 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:33:25.579 [2024-07-23 17:27:20.956758] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:25.580 [2024-07-23 17:27:20.956885] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:33:25.580 [2024-07-23 17:27:20.956910] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:33:25.580 request: 00:33:25.580 { 00:33:25.580 "base_bdev": "BaseBdev1", 00:33:25.580 "raid_bdev": "raid_bdev1", 00:33:25.580 "method": "bdev_raid_add_base_bdev", 00:33:25.580 "req_id": 1 00:33:25.580 } 00:33:25.580 Got JSON-RPC error response 00:33:25.580 response: 00:33:25.580 { 00:33:25.580 "code": -22, 00:33:25.580 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:33:25.580 } 00:33:25.580 17:27:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:33:25.580 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:33:25.580 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:33:25.580 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:33:25.580 17:27:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:26.957 17:27:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:26.957 17:27:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:26.957 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:26.957 "name": "raid_bdev1", 00:33:26.957 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:26.957 "strip_size_kb": 0, 00:33:26.957 "state": "online", 00:33:26.957 "raid_level": "raid1", 00:33:26.957 "superblock": true, 00:33:26.957 "num_base_bdevs": 2, 00:33:26.957 "num_base_bdevs_discovered": 1, 00:33:26.957 "num_base_bdevs_operational": 1, 00:33:26.957 "base_bdevs_list": [ 00:33:26.957 { 00:33:26.957 "name": null, 00:33:26.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:26.957 "is_configured": false, 00:33:26.957 "data_offset": 256, 00:33:26.957 "data_size": 7936 00:33:26.957 }, 00:33:26.957 { 00:33:26.957 "name": "BaseBdev2", 00:33:26.957 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:26.957 "is_configured": true, 00:33:26.957 "data_offset": 256, 00:33:26.957 "data_size": 7936 00:33:26.957 } 00:33:26.957 ] 00:33:26.957 }' 00:33:26.957 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:26.957 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:27.525 17:27:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:27.783 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:33:27.783 "name": "raid_bdev1", 00:33:27.784 "uuid": "bf6dbf48-d2de-4650-b9f1-8b8550493782", 00:33:27.784 "strip_size_kb": 0, 00:33:27.784 "state": "online", 00:33:27.784 "raid_level": "raid1", 00:33:27.784 "superblock": true, 00:33:27.784 "num_base_bdevs": 2, 00:33:27.784 "num_base_bdevs_discovered": 1, 00:33:27.784 "num_base_bdevs_operational": 1, 00:33:27.784 "base_bdevs_list": [ 00:33:27.784 { 00:33:27.784 "name": null, 00:33:27.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:27.784 "is_configured": false, 00:33:27.784 "data_offset": 256, 00:33:27.784 "data_size": 7936 00:33:27.784 }, 00:33:27.784 { 00:33:27.784 "name": "BaseBdev2", 00:33:27.784 "uuid": "3467ba31-2839-57af-b2c9-ad7e7988fb5e", 00:33:27.784 "is_configured": true, 00:33:27.784 "data_offset": 256, 00:33:27.784 "data_size": 7936 00:33:27.784 } 00:33:27.784 ] 00:33:27.784 }' 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 71738 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 71738 ']' 00:33:27.784 
17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 71738 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:27.784 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 71738 00:33:28.042 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:28.042 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:28.042 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 71738' 00:33:28.042 killing process with pid 71738 00:33:28.042 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 71738 00:33:28.042 Received shutdown signal, test time was about 60.000000 seconds 00:33:28.042 00:33:28.042 Latency(us) 00:33:28.042 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:28.042 =================================================================================================================== 00:33:28.042 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:33:28.042 [2024-07-23 17:27:23.223883] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:28.042 [2024-07-23 17:27:23.223976] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:28.043 [2024-07-23 17:27:23.224020] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:28.043 [2024-07-23 17:27:23.224032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28bbbb0 name raid_bdev1, state offline 00:33:28.043 17:27:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 71738 00:33:28.043 [2024-07-23 17:27:23.262752] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:28.301 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:33:28.301 00:33:28.301 real 0m31.189s 00:33:28.301 user 0m49.176s 00:33:28.302 sys 0m5.299s 00:33:28.302 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:28.302 17:27:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:33:28.302 ************************************ 00:33:28.302 END TEST raid_rebuild_test_sb_md_separate 00:33:28.302 ************************************ 00:33:28.302 17:27:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:33:28.302 17:27:23 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:33:28.302 17:27:23 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:33:28.302 17:27:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:28.302 17:27:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:28.302 17:27:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:28.302 ************************************ 00:33:28.302 START TEST raid_state_function_test_sb_md_interleaved 00:33:28.302 ************************************ 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@222 -- # local superblock=true 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:33:28.302 17:27:23 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=76684 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 76684' 00:33:28.302 Process raid pid: 76684 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 76684 /var/tmp/spdk-raid.sock 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 76684 ']' 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:28.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:28.302 17:27:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:28.302 [2024-07-23 17:27:23.622563] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:33:28.302 [2024-07-23 17:27:23.622633] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:28.562 [2024-07-23 17:27:23.745186] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:28.562 [2024-07-23 17:27:23.795638] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:28.562 [2024-07-23 17:27:23.859005] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:28.562 [2024-07-23 17:27:23.859040] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:33:29.499 [2024-07-23 17:27:24.780624] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:29.499 [2024-07-23 17:27:24.780663] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:29.499 [2024-07-23 17:27:24.780674] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:33:29.499 [2024-07-23 17:27:24.780686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:29.499 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:29.500 17:27:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:29.759 17:27:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:29.759 "name": "Existed_Raid", 
00:33:29.759 "uuid": "f5e94409-c076-4771-bff6-b5a00facf914", 00:33:29.759 "strip_size_kb": 0, 00:33:29.759 "state": "configuring", 00:33:29.759 "raid_level": "raid1", 00:33:29.759 "superblock": true, 00:33:29.759 "num_base_bdevs": 2, 00:33:29.759 "num_base_bdevs_discovered": 0, 00:33:29.759 "num_base_bdevs_operational": 2, 00:33:29.759 "base_bdevs_list": [ 00:33:29.759 { 00:33:29.759 "name": "BaseBdev1", 00:33:29.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:29.759 "is_configured": false, 00:33:29.759 "data_offset": 0, 00:33:29.759 "data_size": 0 00:33:29.759 }, 00:33:29.759 { 00:33:29.759 "name": "BaseBdev2", 00:33:29.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:29.759 "is_configured": false, 00:33:29.759 "data_offset": 0, 00:33:29.759 "data_size": 0 00:33:29.759 } 00:33:29.759 ] 00:33:29.759 }' 00:33:29.759 17:27:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:29.759 17:27:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:30.326 17:27:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:30.584 [2024-07-23 17:27:25.883397] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:30.584 [2024-07-23 17:27:25.883427] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9fa3f0 name Existed_Raid, state configuring 00:33:30.584 17:27:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:33:30.843 [2024-07-23 17:27:26.132068] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:33:30.843 [2024-07-23 
17:27:26.132095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:33:30.843 [2024-07-23 17:27:26.132105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:30.843 [2024-07-23 17:27:26.132117] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:33:30.843 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:33:31.102 [2024-07-23 17:27:26.390646] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:31.102 BaseBdev1 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:31.102 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:31.360 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:33:31.620 [ 00:33:31.620 { 00:33:31.620 "name": "BaseBdev1", 00:33:31.620 "aliases": [ 00:33:31.620 "bc480cea-cf54-427c-8e38-c5686ec0d8f9" 00:33:31.620 ], 00:33:31.620 "product_name": "Malloc disk", 00:33:31.620 "block_size": 4128, 00:33:31.620 "num_blocks": 8192, 00:33:31.620 "uuid": "bc480cea-cf54-427c-8e38-c5686ec0d8f9", 00:33:31.620 "md_size": 32, 00:33:31.620 "md_interleave": true, 00:33:31.620 "dif_type": 0, 00:33:31.620 "assigned_rate_limits": { 00:33:31.620 "rw_ios_per_sec": 0, 00:33:31.620 "rw_mbytes_per_sec": 0, 00:33:31.620 "r_mbytes_per_sec": 0, 00:33:31.620 "w_mbytes_per_sec": 0 00:33:31.620 }, 00:33:31.620 "claimed": true, 00:33:31.620 "claim_type": "exclusive_write", 00:33:31.620 "zoned": false, 00:33:31.620 "supported_io_types": { 00:33:31.620 "read": true, 00:33:31.620 "write": true, 00:33:31.620 "unmap": true, 00:33:31.620 "flush": true, 00:33:31.620 "reset": true, 00:33:31.620 "nvme_admin": false, 00:33:31.620 "nvme_io": false, 00:33:31.620 "nvme_io_md": false, 00:33:31.620 "write_zeroes": true, 00:33:31.620 "zcopy": true, 00:33:31.620 "get_zone_info": false, 00:33:31.620 "zone_management": false, 00:33:31.620 "zone_append": false, 00:33:31.620 "compare": false, 00:33:31.620 "compare_and_write": false, 00:33:31.620 "abort": true, 00:33:31.620 "seek_hole": false, 00:33:31.620 "seek_data": false, 00:33:31.620 "copy": true, 00:33:31.620 "nvme_iov_md": false 00:33:31.620 }, 00:33:31.620 "memory_domains": [ 00:33:31.620 { 00:33:31.620 "dma_device_id": "system", 00:33:31.620 "dma_device_type": 1 00:33:31.620 }, 00:33:31.620 { 00:33:31.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:31.620 "dma_device_type": 2 00:33:31.620 } 00:33:31.620 ], 00:33:31.620 "driver_specific": {} 00:33:31.620 } 00:33:31.620 ] 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:31.620 17:27:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:31.879 17:27:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:31.879 "name": "Existed_Raid", 00:33:31.879 "uuid": "bdd1ffc1-dbc5-4833-9894-5e23858fe133", 00:33:31.879 "strip_size_kb": 0, 00:33:31.879 "state": "configuring", 00:33:31.879 "raid_level": "raid1", 00:33:31.879 "superblock": true, 00:33:31.879 "num_base_bdevs": 2, 
00:33:31.879 "num_base_bdevs_discovered": 1, 00:33:31.879 "num_base_bdevs_operational": 2, 00:33:31.879 "base_bdevs_list": [ 00:33:31.879 { 00:33:31.879 "name": "BaseBdev1", 00:33:31.879 "uuid": "bc480cea-cf54-427c-8e38-c5686ec0d8f9", 00:33:31.879 "is_configured": true, 00:33:31.879 "data_offset": 256, 00:33:31.879 "data_size": 7936 00:33:31.879 }, 00:33:31.879 { 00:33:31.879 "name": "BaseBdev2", 00:33:31.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:31.879 "is_configured": false, 00:33:31.879 "data_offset": 0, 00:33:31.879 "data_size": 0 00:33:31.879 } 00:33:31.879 ] 00:33:31.879 }' 00:33:31.879 17:27:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:31.879 17:27:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:32.447 17:27:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:33:32.705 [2024-07-23 17:27:27.982898] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:33:32.705 [2024-07-23 17:27:27.982937] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f9d20 name Existed_Raid, state configuring 00:33:32.705 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:33:32.964 [2024-07-23 17:27:28.231582] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:33:32.964 [2024-07-23 17:27:28.233035] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:33:32.964 [2024-07-23 17:27:28.233068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 
00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:32.964 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:33.223 17:27:28 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:33.223 "name": "Existed_Raid", 00:33:33.223 "uuid": "256ea7f3-ddd1-4661-a4e2-bdfde1c34d31", 00:33:33.223 "strip_size_kb": 0, 00:33:33.223 "state": "configuring", 00:33:33.223 "raid_level": "raid1", 00:33:33.223 "superblock": true, 00:33:33.223 "num_base_bdevs": 2, 00:33:33.223 "num_base_bdevs_discovered": 1, 00:33:33.223 "num_base_bdevs_operational": 2, 00:33:33.223 "base_bdevs_list": [ 00:33:33.223 { 00:33:33.223 "name": "BaseBdev1", 00:33:33.223 "uuid": "bc480cea-cf54-427c-8e38-c5686ec0d8f9", 00:33:33.223 "is_configured": true, 00:33:33.223 "data_offset": 256, 00:33:33.223 "data_size": 7936 00:33:33.223 }, 00:33:33.223 { 00:33:33.223 "name": "BaseBdev2", 00:33:33.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:33.223 "is_configured": false, 00:33:33.223 "data_offset": 0, 00:33:33.223 "data_size": 0 00:33:33.223 } 00:33:33.223 ] 00:33:33.223 }' 00:33:33.223 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:33.223 17:27:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:33.790 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:33:34.049 [2024-07-23 17:27:29.327273] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:33:34.049 [2024-07-23 17:27:29.327411] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xb95f30 00:33:34.049 [2024-07-23 17:27:29.327424] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:33:34.049 [2024-07-23 17:27:29.327484] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb97440 00:33:34.049 [2024-07-23 17:27:29.327561] 
bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb95f30 00:33:34.049 [2024-07-23 17:27:29.327571] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb95f30 00:33:34.049 [2024-07-23 17:27:29.327624] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:34.049 BaseBdev2 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:34.049 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:33:34.308 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:33:34.567 [ 00:33:34.567 { 00:33:34.567 "name": "BaseBdev2", 00:33:34.567 "aliases": [ 00:33:34.567 "848a2a15-1f76-490c-bd06-53f0f0f65fbc" 00:33:34.567 ], 00:33:34.567 "product_name": "Malloc disk", 00:33:34.567 "block_size": 4128, 00:33:34.567 "num_blocks": 8192, 00:33:34.567 "uuid": "848a2a15-1f76-490c-bd06-53f0f0f65fbc", 00:33:34.567 "md_size": 32, 00:33:34.567 "md_interleave": 
true, 00:33:34.567 "dif_type": 0, 00:33:34.567 "assigned_rate_limits": { 00:33:34.567 "rw_ios_per_sec": 0, 00:33:34.567 "rw_mbytes_per_sec": 0, 00:33:34.567 "r_mbytes_per_sec": 0, 00:33:34.567 "w_mbytes_per_sec": 0 00:33:34.567 }, 00:33:34.567 "claimed": true, 00:33:34.567 "claim_type": "exclusive_write", 00:33:34.567 "zoned": false, 00:33:34.567 "supported_io_types": { 00:33:34.567 "read": true, 00:33:34.567 "write": true, 00:33:34.567 "unmap": true, 00:33:34.567 "flush": true, 00:33:34.567 "reset": true, 00:33:34.567 "nvme_admin": false, 00:33:34.567 "nvme_io": false, 00:33:34.567 "nvme_io_md": false, 00:33:34.567 "write_zeroes": true, 00:33:34.567 "zcopy": true, 00:33:34.567 "get_zone_info": false, 00:33:34.567 "zone_management": false, 00:33:34.567 "zone_append": false, 00:33:34.567 "compare": false, 00:33:34.567 "compare_and_write": false, 00:33:34.567 "abort": true, 00:33:34.567 "seek_hole": false, 00:33:34.567 "seek_data": false, 00:33:34.567 "copy": true, 00:33:34.567 "nvme_iov_md": false 00:33:34.567 }, 00:33:34.567 "memory_domains": [ 00:33:34.567 { 00:33:34.567 "dma_device_id": "system", 00:33:34.567 "dma_device_type": 1 00:33:34.567 }, 00:33:34.567 { 00:33:34.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:34.568 "dma_device_type": 2 00:33:34.568 } 00:33:34.568 ], 00:33:34.568 "driver_specific": {} 00:33:34.568 } 00:33:34.568 ] 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=Existed_Raid 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:34.568 17:27:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:35.134 17:27:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:35.134 "name": "Existed_Raid", 00:33:35.134 "uuid": "256ea7f3-ddd1-4661-a4e2-bdfde1c34d31", 00:33:35.134 "strip_size_kb": 0, 00:33:35.134 "state": "online", 00:33:35.134 "raid_level": "raid1", 00:33:35.134 "superblock": true, 00:33:35.134 "num_base_bdevs": 2, 00:33:35.134 "num_base_bdevs_discovered": 2, 00:33:35.134 "num_base_bdevs_operational": 2, 00:33:35.134 "base_bdevs_list": [ 00:33:35.134 { 00:33:35.134 "name": "BaseBdev1", 
00:33:35.134 "uuid": "bc480cea-cf54-427c-8e38-c5686ec0d8f9", 00:33:35.134 "is_configured": true, 00:33:35.134 "data_offset": 256, 00:33:35.134 "data_size": 7936 00:33:35.134 }, 00:33:35.134 { 00:33:35.134 "name": "BaseBdev2", 00:33:35.134 "uuid": "848a2a15-1f76-490c-bd06-53f0f0f65fbc", 00:33:35.134 "is_configured": true, 00:33:35.134 "data_offset": 256, 00:33:35.134 "data_size": 7936 00:33:35.134 } 00:33:35.134 ] 00:33:35.134 }' 00:33:35.134 17:27:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:35.134 17:27:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:33:36.069 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:36.637 [2024-07-23 17:27:31.798221] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:36.637 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:36.637 "name": "Existed_Raid", 00:33:36.637 "aliases": [ 00:33:36.637 "256ea7f3-ddd1-4661-a4e2-bdfde1c34d31" 00:33:36.637 ], 00:33:36.637 "product_name": "Raid Volume", 00:33:36.637 "block_size": 4128, 00:33:36.637 "num_blocks": 7936, 00:33:36.637 "uuid": "256ea7f3-ddd1-4661-a4e2-bdfde1c34d31", 00:33:36.637 "md_size": 32, 00:33:36.637 "md_interleave": true, 00:33:36.637 "dif_type": 0, 00:33:36.637 "assigned_rate_limits": { 00:33:36.637 "rw_ios_per_sec": 0, 00:33:36.637 "rw_mbytes_per_sec": 0, 00:33:36.637 "r_mbytes_per_sec": 0, 00:33:36.637 "w_mbytes_per_sec": 0 00:33:36.637 }, 00:33:36.637 "claimed": false, 00:33:36.637 "zoned": false, 00:33:36.637 "supported_io_types": { 00:33:36.637 "read": true, 00:33:36.637 "write": true, 00:33:36.637 "unmap": false, 00:33:36.637 "flush": false, 00:33:36.637 "reset": true, 00:33:36.637 "nvme_admin": false, 00:33:36.637 "nvme_io": false, 00:33:36.637 "nvme_io_md": false, 00:33:36.637 "write_zeroes": true, 00:33:36.637 "zcopy": false, 00:33:36.637 "get_zone_info": false, 00:33:36.637 "zone_management": false, 00:33:36.637 "zone_append": false, 00:33:36.637 "compare": false, 00:33:36.637 "compare_and_write": false, 00:33:36.637 "abort": false, 00:33:36.637 "seek_hole": false, 00:33:36.637 "seek_data": false, 00:33:36.637 "copy": false, 00:33:36.637 "nvme_iov_md": false 00:33:36.637 }, 00:33:36.637 "memory_domains": [ 00:33:36.637 { 00:33:36.637 "dma_device_id": "system", 00:33:36.637 "dma_device_type": 1 00:33:36.637 }, 00:33:36.637 { 00:33:36.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:36.637 "dma_device_type": 2 00:33:36.637 }, 00:33:36.637 { 00:33:36.637 "dma_device_id": "system", 00:33:36.637 "dma_device_type": 1 00:33:36.637 }, 00:33:36.637 { 00:33:36.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:36.637 "dma_device_type": 2 00:33:36.637 } 00:33:36.637 ], 00:33:36.637 "driver_specific": { 00:33:36.637 "raid": { 00:33:36.637 "uuid": 
"256ea7f3-ddd1-4661-a4e2-bdfde1c34d31", 00:33:36.637 "strip_size_kb": 0, 00:33:36.637 "state": "online", 00:33:36.637 "raid_level": "raid1", 00:33:36.637 "superblock": true, 00:33:36.637 "num_base_bdevs": 2, 00:33:36.637 "num_base_bdevs_discovered": 2, 00:33:36.637 "num_base_bdevs_operational": 2, 00:33:36.637 "base_bdevs_list": [ 00:33:36.637 { 00:33:36.637 "name": "BaseBdev1", 00:33:36.637 "uuid": "bc480cea-cf54-427c-8e38-c5686ec0d8f9", 00:33:36.637 "is_configured": true, 00:33:36.637 "data_offset": 256, 00:33:36.637 "data_size": 7936 00:33:36.637 }, 00:33:36.637 { 00:33:36.637 "name": "BaseBdev2", 00:33:36.637 "uuid": "848a2a15-1f76-490c-bd06-53f0f0f65fbc", 00:33:36.637 "is_configured": true, 00:33:36.637 "data_offset": 256, 00:33:36.637 "data_size": 7936 00:33:36.637 } 00:33:36.637 ] 00:33:36.637 } 00:33:36.637 } 00:33:36.637 }' 00:33:36.637 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:36.637 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:33:36.637 BaseBdev2' 00:33:36.637 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:36.637 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:33:36.637 17:27:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:37.205 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:37.206 "name": "BaseBdev1", 00:33:37.206 "aliases": [ 00:33:37.206 "bc480cea-cf54-427c-8e38-c5686ec0d8f9" 00:33:37.206 ], 00:33:37.206 "product_name": "Malloc disk", 00:33:37.206 "block_size": 4128, 
00:33:37.206 "num_blocks": 8192, 00:33:37.206 "uuid": "bc480cea-cf54-427c-8e38-c5686ec0d8f9", 00:33:37.206 "md_size": 32, 00:33:37.206 "md_interleave": true, 00:33:37.206 "dif_type": 0, 00:33:37.206 "assigned_rate_limits": { 00:33:37.206 "rw_ios_per_sec": 0, 00:33:37.206 "rw_mbytes_per_sec": 0, 00:33:37.206 "r_mbytes_per_sec": 0, 00:33:37.206 "w_mbytes_per_sec": 0 00:33:37.206 }, 00:33:37.206 "claimed": true, 00:33:37.206 "claim_type": "exclusive_write", 00:33:37.206 "zoned": false, 00:33:37.206 "supported_io_types": { 00:33:37.206 "read": true, 00:33:37.206 "write": true, 00:33:37.206 "unmap": true, 00:33:37.206 "flush": true, 00:33:37.206 "reset": true, 00:33:37.206 "nvme_admin": false, 00:33:37.206 "nvme_io": false, 00:33:37.206 "nvme_io_md": false, 00:33:37.206 "write_zeroes": true, 00:33:37.206 "zcopy": true, 00:33:37.206 "get_zone_info": false, 00:33:37.206 "zone_management": false, 00:33:37.206 "zone_append": false, 00:33:37.206 "compare": false, 00:33:37.206 "compare_and_write": false, 00:33:37.206 "abort": true, 00:33:37.206 "seek_hole": false, 00:33:37.206 "seek_data": false, 00:33:37.206 "copy": true, 00:33:37.206 "nvme_iov_md": false 00:33:37.206 }, 00:33:37.206 "memory_domains": [ 00:33:37.206 { 00:33:37.206 "dma_device_id": "system", 00:33:37.206 "dma_device_type": 1 00:33:37.206 }, 00:33:37.206 { 00:33:37.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:37.206 "dma_device_type": 2 00:33:37.206 } 00:33:37.206 ], 00:33:37.206 "driver_specific": {} 00:33:37.206 }' 00:33:37.206 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:37.206 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:37.206 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:33:37.206 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:37.206 
17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:37.464 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:33:37.464 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:37.464 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:37.464 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:33:37.464 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:37.464 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:37.723 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:33:37.723 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:37.723 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:33:37.723 17:27:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:38.290 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:38.290 "name": "BaseBdev2", 00:33:38.290 "aliases": [ 00:33:38.290 "848a2a15-1f76-490c-bd06-53f0f0f65fbc" 00:33:38.290 ], 00:33:38.290 "product_name": "Malloc disk", 00:33:38.290 "block_size": 4128, 00:33:38.290 "num_blocks": 8192, 00:33:38.290 "uuid": "848a2a15-1f76-490c-bd06-53f0f0f65fbc", 00:33:38.290 "md_size": 32, 00:33:38.290 "md_interleave": true, 00:33:38.290 "dif_type": 0, 00:33:38.290 "assigned_rate_limits": { 00:33:38.290 
"rw_ios_per_sec": 0, 00:33:38.290 "rw_mbytes_per_sec": 0, 00:33:38.290 "r_mbytes_per_sec": 0, 00:33:38.290 "w_mbytes_per_sec": 0 00:33:38.290 }, 00:33:38.290 "claimed": true, 00:33:38.290 "claim_type": "exclusive_write", 00:33:38.290 "zoned": false, 00:33:38.290 "supported_io_types": { 00:33:38.290 "read": true, 00:33:38.290 "write": true, 00:33:38.291 "unmap": true, 00:33:38.291 "flush": true, 00:33:38.291 "reset": true, 00:33:38.291 "nvme_admin": false, 00:33:38.291 "nvme_io": false, 00:33:38.291 "nvme_io_md": false, 00:33:38.291 "write_zeroes": true, 00:33:38.291 "zcopy": true, 00:33:38.291 "get_zone_info": false, 00:33:38.291 "zone_management": false, 00:33:38.291 "zone_append": false, 00:33:38.291 "compare": false, 00:33:38.291 "compare_and_write": false, 00:33:38.291 "abort": true, 00:33:38.291 "seek_hole": false, 00:33:38.291 "seek_data": false, 00:33:38.291 "copy": true, 00:33:38.291 "nvme_iov_md": false 00:33:38.291 }, 00:33:38.291 "memory_domains": [ 00:33:38.291 { 00:33:38.291 "dma_device_id": "system", 00:33:38.291 "dma_device_type": 1 00:33:38.291 }, 00:33:38.291 { 00:33:38.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:38.291 "dma_device_type": 2 00:33:38.291 } 00:33:38.291 ], 00:33:38.291 "driver_specific": {} 00:33:38.291 }' 00:33:38.291 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:38.291 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:38.291 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:33:38.291 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:38.291 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:38.291 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:33:38.291 
17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:38.549 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:38.549 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:33:38.549 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:38.549 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:38.809 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:33:38.809 17:27:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:33:39.068 [2024-07-23 17:27:34.264519] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:33:39.068 17:27:34 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:39.068 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:33:39.636 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:39.636 "name": "Existed_Raid", 00:33:39.636 "uuid": "256ea7f3-ddd1-4661-a4e2-bdfde1c34d31", 00:33:39.636 "strip_size_kb": 0, 00:33:39.636 "state": "online", 00:33:39.636 "raid_level": "raid1", 00:33:39.636 "superblock": true, 00:33:39.636 "num_base_bdevs": 2, 00:33:39.636 "num_base_bdevs_discovered": 1, 00:33:39.636 "num_base_bdevs_operational": 1, 00:33:39.636 "base_bdevs_list": [ 00:33:39.636 { 00:33:39.636 "name": null, 00:33:39.636 "uuid": "00000000-0000-0000-0000-000000000000", 
00:33:39.636 "is_configured": false, 00:33:39.636 "data_offset": 256, 00:33:39.636 "data_size": 7936 00:33:39.636 }, 00:33:39.636 { 00:33:39.636 "name": "BaseBdev2", 00:33:39.636 "uuid": "848a2a15-1f76-490c-bd06-53f0f0f65fbc", 00:33:39.636 "is_configured": true, 00:33:39.636 "data_offset": 256, 00:33:39.636 "data_size": 7936 00:33:39.636 } 00:33:39.636 ] 00:33:39.636 }' 00:33:39.636 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:39.636 17:27:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:40.204 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:33:40.204 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:40.204 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:40.204 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:33:40.464 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:33:40.464 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:33:40.464 17:27:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:33:41.032 [2024-07-23 17:27:36.167549] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:33:41.032 [2024-07-23 17:27:36.167636] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:41.032 [2024-07-23 17:27:36.180749] 
bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:41.032 [2024-07-23 17:27:36.180786] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:41.032 [2024-07-23 17:27:36.180804] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb95f30 name Existed_Raid, state offline 00:33:41.032 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:33:41.032 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:33:41.032 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:41.032 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 76684 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 76684 ']' 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 76684 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:41.599 17:27:36 
bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 76684 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:41.599 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:41.600 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 76684' 00:33:41.600 killing process with pid 76684 00:33:41.600 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 76684 00:33:41.600 [2024-07-23 17:27:36.777446] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:33:41.600 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 76684 00:33:41.600 [2024-07-23 17:27:36.778425] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:33:41.600 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:33:41.600 00:33:41.600 real 0m13.427s 00:33:41.600 user 0m24.157s 00:33:41.600 sys 0m2.294s 00:33:41.600 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:41.600 17:27:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:41.600 ************************************ 00:33:41.600 END TEST raid_state_function_test_sb_md_interleaved 00:33:41.600 ************************************ 00:33:41.859 17:27:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:33:41.859 17:27:37 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:33:41.859 17:27:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:33:41.859 17:27:37 
bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:41.859 17:27:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:33:41.859 ************************************ 00:33:41.859 START TEST raid_superblock_test_md_interleaved 00:33:41.859 ************************************ 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 
00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=78570 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 78570 /var/tmp/spdk-raid.sock 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 78570 ']' 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:33:41.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:41.859 17:27:37 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:41.859 [2024-07-23 17:27:37.134408] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:33:41.859 [2024-07-23 17:27:37.134473] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid78570 ] 00:33:41.859 [2024-07-23 17:27:37.266660] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:42.147 [2024-07-23 17:27:37.319404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:42.147 [2024-07-23 17:27:37.384852] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:42.147 [2024-07-23 17:27:37.384903] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:33:42.714 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:42.714 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:33:42.715 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:33:43.282 malloc1 00:33:43.282 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:33:43.541 [2024-07-23 17:27:38.924648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:43.541 [2024-07-23 17:27:38.924700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:43.541 [2024-07-23 17:27:38.924721] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23192a0 00:33:43.541 [2024-07-23 17:27:38.924734] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:43.542 [2024-07-23 17:27:38.926221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:43.542 [2024-07-23 17:27:38.926249] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:43.542 pt1 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:33:43.542 17:27:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:33:44.110 malloc2 00:33:44.110 17:27:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:44.370 [2024-07-23 17:27:39.743756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:44.370 [2024-07-23 17:27:39.743805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:44.370 [2024-07-23 17:27:39.743824] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21fe2c0 00:33:44.370 [2024-07-23 17:27:39.743837] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:44.370 [2024-07-23 17:27:39.745275] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:44.370 [2024-07-23 17:27:39.745303] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:44.370 pt2 00:33:44.370 17:27:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:33:44.370 17:27:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:33:44.370 17:27:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:33:44.938 [2024-07-23 17:27:40.249103] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:33:44.938 [2024-07-23 17:27:40.250591] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:44.938 [2024-07-23 17:27:40.250746] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2313850 00:33:44.938 [2024-07-23 17:27:40.250759] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:33:44.938 [2024-07-23 17:27:40.250831] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217bf00 00:33:44.938 [2024-07-23 17:27:40.250927] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2313850 00:33:44.938 [2024-07-23 17:27:40.250937] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2313850 00:33:44.938 [2024-07-23 17:27:40.250999] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:44.938 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:45.506 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:45.506 "name": "raid_bdev1", 00:33:45.506 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:45.506 "strip_size_kb": 0, 00:33:45.506 "state": "online", 00:33:45.506 "raid_level": "raid1", 00:33:45.506 "superblock": true, 00:33:45.506 "num_base_bdevs": 2, 00:33:45.506 "num_base_bdevs_discovered": 2, 00:33:45.506 "num_base_bdevs_operational": 2, 00:33:45.506 "base_bdevs_list": [ 00:33:45.506 { 00:33:45.506 "name": "pt1", 00:33:45.506 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:45.506 "is_configured": true, 00:33:45.506 "data_offset": 256, 00:33:45.506 "data_size": 7936 00:33:45.506 }, 00:33:45.506 { 00:33:45.506 "name": "pt2", 00:33:45.506 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:45.506 "is_configured": true, 00:33:45.506 "data_offset": 256, 00:33:45.506 "data_size": 7936 00:33:45.506 } 00:33:45.506 ] 00:33:45.506 }' 00:33:45.506 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:45.506 17:27:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:33:46.444 17:27:41 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:46.444 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:46.703 [2024-07-23 17:27:41.933790] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:46.703 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:46.703 "name": "raid_bdev1", 00:33:46.703 "aliases": [ 00:33:46.703 "90df058d-086a-4f8a-93ac-72bab19468fb" 00:33:46.703 ], 00:33:46.703 "product_name": "Raid Volume", 00:33:46.703 "block_size": 4128, 00:33:46.703 "num_blocks": 7936, 00:33:46.703 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:46.703 "md_size": 32, 00:33:46.703 "md_interleave": true, 00:33:46.703 "dif_type": 0, 00:33:46.703 "assigned_rate_limits": { 00:33:46.703 "rw_ios_per_sec": 0, 00:33:46.703 "rw_mbytes_per_sec": 0, 00:33:46.703 "r_mbytes_per_sec": 0, 00:33:46.703 "w_mbytes_per_sec": 0 00:33:46.703 }, 00:33:46.703 "claimed": false, 00:33:46.703 "zoned": false, 00:33:46.703 "supported_io_types": { 00:33:46.703 "read": true, 00:33:46.703 "write": true, 00:33:46.703 "unmap": false, 00:33:46.703 "flush": false, 00:33:46.703 "reset": true, 00:33:46.703 "nvme_admin": false, 
00:33:46.703 "nvme_io": false, 00:33:46.703 "nvme_io_md": false, 00:33:46.703 "write_zeroes": true, 00:33:46.703 "zcopy": false, 00:33:46.703 "get_zone_info": false, 00:33:46.703 "zone_management": false, 00:33:46.703 "zone_append": false, 00:33:46.703 "compare": false, 00:33:46.703 "compare_and_write": false, 00:33:46.703 "abort": false, 00:33:46.703 "seek_hole": false, 00:33:46.703 "seek_data": false, 00:33:46.703 "copy": false, 00:33:46.703 "nvme_iov_md": false 00:33:46.703 }, 00:33:46.703 "memory_domains": [ 00:33:46.703 { 00:33:46.703 "dma_device_id": "system", 00:33:46.703 "dma_device_type": 1 00:33:46.703 }, 00:33:46.703 { 00:33:46.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:46.703 "dma_device_type": 2 00:33:46.703 }, 00:33:46.703 { 00:33:46.703 "dma_device_id": "system", 00:33:46.703 "dma_device_type": 1 00:33:46.703 }, 00:33:46.703 { 00:33:46.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:46.703 "dma_device_type": 2 00:33:46.703 } 00:33:46.703 ], 00:33:46.703 "driver_specific": { 00:33:46.703 "raid": { 00:33:46.703 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:46.703 "strip_size_kb": 0, 00:33:46.703 "state": "online", 00:33:46.703 "raid_level": "raid1", 00:33:46.703 "superblock": true, 00:33:46.703 "num_base_bdevs": 2, 00:33:46.703 "num_base_bdevs_discovered": 2, 00:33:46.703 "num_base_bdevs_operational": 2, 00:33:46.703 "base_bdevs_list": [ 00:33:46.703 { 00:33:46.703 "name": "pt1", 00:33:46.703 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:46.703 "is_configured": true, 00:33:46.703 "data_offset": 256, 00:33:46.703 "data_size": 7936 00:33:46.703 }, 00:33:46.703 { 00:33:46.703 "name": "pt2", 00:33:46.703 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:46.703 "is_configured": true, 00:33:46.703 "data_offset": 256, 00:33:46.703 "data_size": 7936 00:33:46.703 } 00:33:46.703 ] 00:33:46.703 } 00:33:46.703 } 00:33:46.703 }' 00:33:46.703 17:27:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:46.703 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:33:46.703 pt2' 00:33:46.703 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:46.703 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:33:46.703 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:47.268 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:47.268 "name": "pt1", 00:33:47.268 "aliases": [ 00:33:47.268 "00000000-0000-0000-0000-000000000001" 00:33:47.268 ], 00:33:47.268 "product_name": "passthru", 00:33:47.268 "block_size": 4128, 00:33:47.268 "num_blocks": 8192, 00:33:47.268 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:47.268 "md_size": 32, 00:33:47.268 "md_interleave": true, 00:33:47.268 "dif_type": 0, 00:33:47.268 "assigned_rate_limits": { 00:33:47.268 "rw_ios_per_sec": 0, 00:33:47.268 "rw_mbytes_per_sec": 0, 00:33:47.268 "r_mbytes_per_sec": 0, 00:33:47.268 "w_mbytes_per_sec": 0 00:33:47.268 }, 00:33:47.268 "claimed": true, 00:33:47.268 "claim_type": "exclusive_write", 00:33:47.268 "zoned": false, 00:33:47.268 "supported_io_types": { 00:33:47.268 "read": true, 00:33:47.268 "write": true, 00:33:47.268 "unmap": true, 00:33:47.268 "flush": true, 00:33:47.268 "reset": true, 00:33:47.268 "nvme_admin": false, 00:33:47.268 "nvme_io": false, 00:33:47.268 "nvme_io_md": false, 00:33:47.268 "write_zeroes": true, 00:33:47.268 "zcopy": true, 00:33:47.268 "get_zone_info": false, 00:33:47.268 "zone_management": false, 00:33:47.268 "zone_append": false, 00:33:47.268 "compare": false, 00:33:47.268 "compare_and_write": false, 00:33:47.268 
"abort": true, 00:33:47.268 "seek_hole": false, 00:33:47.268 "seek_data": false, 00:33:47.268 "copy": true, 00:33:47.268 "nvme_iov_md": false 00:33:47.268 }, 00:33:47.268 "memory_domains": [ 00:33:47.268 { 00:33:47.268 "dma_device_id": "system", 00:33:47.268 "dma_device_type": 1 00:33:47.268 }, 00:33:47.268 { 00:33:47.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:47.268 "dma_device_type": 2 00:33:47.268 } 00:33:47.268 ], 00:33:47.268 "driver_specific": { 00:33:47.268 "passthru": { 00:33:47.268 "name": "pt1", 00:33:47.268 "base_bdev_name": "malloc1" 00:33:47.268 } 00:33:47.268 } 00:33:47.268 }' 00:33:47.268 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:47.268 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:47.268 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:33:47.268 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:47.526 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:47.526 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:33:47.526 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:47.526 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:47.526 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:33:47.526 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:47.785 17:27:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:47.785 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:33:47.785 17:27:43 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:47.785 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:33:47.785 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:48.353 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:48.353 "name": "pt2", 00:33:48.353 "aliases": [ 00:33:48.353 "00000000-0000-0000-0000-000000000002" 00:33:48.353 ], 00:33:48.353 "product_name": "passthru", 00:33:48.353 "block_size": 4128, 00:33:48.353 "num_blocks": 8192, 00:33:48.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:48.353 "md_size": 32, 00:33:48.353 "md_interleave": true, 00:33:48.353 "dif_type": 0, 00:33:48.354 "assigned_rate_limits": { 00:33:48.354 "rw_ios_per_sec": 0, 00:33:48.354 "rw_mbytes_per_sec": 0, 00:33:48.354 "r_mbytes_per_sec": 0, 00:33:48.354 "w_mbytes_per_sec": 0 00:33:48.354 }, 00:33:48.354 "claimed": true, 00:33:48.354 "claim_type": "exclusive_write", 00:33:48.354 "zoned": false, 00:33:48.354 "supported_io_types": { 00:33:48.354 "read": true, 00:33:48.354 "write": true, 00:33:48.354 "unmap": true, 00:33:48.354 "flush": true, 00:33:48.354 "reset": true, 00:33:48.354 "nvme_admin": false, 00:33:48.354 "nvme_io": false, 00:33:48.354 "nvme_io_md": false, 00:33:48.354 "write_zeroes": true, 00:33:48.354 "zcopy": true, 00:33:48.354 "get_zone_info": false, 00:33:48.354 "zone_management": false, 00:33:48.354 "zone_append": false, 00:33:48.354 "compare": false, 00:33:48.354 "compare_and_write": false, 00:33:48.354 "abort": true, 00:33:48.354 "seek_hole": false, 00:33:48.354 "seek_data": false, 00:33:48.354 "copy": true, 00:33:48.354 "nvme_iov_md": false 00:33:48.354 }, 00:33:48.354 "memory_domains": [ 00:33:48.354 { 00:33:48.354 "dma_device_id": 
"system", 00:33:48.354 "dma_device_type": 1 00:33:48.354 }, 00:33:48.354 { 00:33:48.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:48.354 "dma_device_type": 2 00:33:48.354 } 00:33:48.354 ], 00:33:48.354 "driver_specific": { 00:33:48.354 "passthru": { 00:33:48.354 "name": "pt2", 00:33:48.354 "base_bdev_name": "malloc2" 00:33:48.354 } 00:33:48.354 } 00:33:48.354 }' 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:33:48.354 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:48.612 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:48.612 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:33:48.612 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:48.612 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:48.612 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:33:48.612 17:27:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:48.612 17:27:43 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:33:48.872 [2024-07-23 17:27:44.151679] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:48.872 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=90df058d-086a-4f8a-93ac-72bab19468fb 00:33:48.872 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 90df058d-086a-4f8a-93ac-72bab19468fb ']' 00:33:48.872 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:49.130 [2024-07-23 17:27:44.444202] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:49.130 [2024-07-23 17:27:44.444223] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:49.130 [2024-07-23 17:27:44.444276] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:49.130 [2024-07-23 17:27:44.444330] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:49.130 [2024-07-23 17:27:44.444342] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2313850 name raid_bdev1, state offline 00:33:49.130 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:49.130 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:33:49.388 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:33:49.388 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:33:49.388 17:27:44 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:49.388 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:33:49.647 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:33:49.647 17:27:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:49.906 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:33:49.906 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:33:50.164 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:33:50.164 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:33:50.164 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:33:50.164 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:50.165 17:27:45 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:33:50.165 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:33:50.732 [2024-07-23 17:27:45.964136] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:33:50.732 [2024-07-23 17:27:45.965591] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:33:50.732 [2024-07-23 17:27:45.965645] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:33:50.732 [2024-07-23 17:27:45.965683] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:33:50.732 [2024-07-23 17:27:45.965702] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:50.732 [2024-07-23 17:27:45.965712] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23194d0 name raid_bdev1, state configuring 00:33:50.732 request: 00:33:50.732 { 00:33:50.732 "name": "raid_bdev1", 00:33:50.732 "raid_level": "raid1", 00:33:50.732 "base_bdevs": [ 00:33:50.732 "malloc1", 00:33:50.732 "malloc2" 00:33:50.732 ], 00:33:50.732 "superblock": false, 00:33:50.732 "method": "bdev_raid_create", 00:33:50.732 "req_id": 1 00:33:50.732 } 00:33:50.732 Got JSON-RPC error response 00:33:50.732 response: 00:33:50.732 { 00:33:50.732 "code": -17, 00:33:50.732 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:33:50.732 } 00:33:50.732 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:33:50.732 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:33:50.732 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:33:50.732 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:33:50.732 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:50.732 17:27:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:33:51.300 17:27:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:33:51.300 17:27:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:33:51.300 17:27:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:33:51.869 [2024-07-23 17:27:46.990764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:33:51.869 [2024-07-23 17:27:46.990812] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:51.869 [2024-07-23 17:27:46.990830] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21fe720 00:33:51.869 [2024-07-23 17:27:46.990843] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:51.869 [2024-07-23 17:27:46.992305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:51.869 [2024-07-23 17:27:46.992339] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:33:51.869 [2024-07-23 17:27:46.992387] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:33:51.869 [2024-07-23 17:27:46.992411] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:33:51.869 pt1 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:51.869 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:52.127 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:52.128 "name": "raid_bdev1", 00:33:52.128 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:52.128 "strip_size_kb": 0, 00:33:52.128 "state": "configuring", 00:33:52.128 "raid_level": "raid1", 00:33:52.128 "superblock": true, 00:33:52.128 "num_base_bdevs": 2, 00:33:52.128 "num_base_bdevs_discovered": 1, 00:33:52.128 "num_base_bdevs_operational": 2, 00:33:52.128 "base_bdevs_list": [ 00:33:52.128 { 00:33:52.128 "name": "pt1", 00:33:52.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:52.128 "is_configured": true, 00:33:52.128 "data_offset": 256, 00:33:52.128 "data_size": 7936 00:33:52.128 }, 00:33:52.128 { 00:33:52.128 "name": null, 00:33:52.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:52.128 "is_configured": false, 00:33:52.128 "data_offset": 256, 00:33:52.128 "data_size": 7936 00:33:52.128 } 00:33:52.128 ] 00:33:52.128 }' 00:33:52.128 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:52.128 17:27:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:53.064 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:33:53.064 17:27:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:33:53.064 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:53.064 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:53.324 [2024-07-23 17:27:48.735408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:53.324 [2024-07-23 17:27:48.735459] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:53.324 [2024-07-23 17:27:48.735477] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2316ed0 00:33:53.324 [2024-07-23 17:27:48.735490] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:53.324 [2024-07-23 17:27:48.735645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:53.324 [2024-07-23 17:27:48.735661] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:53.324 [2024-07-23 17:27:48.735704] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:53.324 [2024-07-23 17:27:48.735727] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:53.324 [2024-07-23 17:27:48.735810] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2317fb0 00:33:53.324 [2024-07-23 17:27:48.735821] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:33:53.324 [2024-07-23 17:27:48.735875] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2318460 00:33:53.324 [2024-07-23 17:27:48.735958] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2317fb0 00:33:53.324 [2024-07-23 17:27:48.735968] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2317fb0 00:33:53.324 [2024-07-23 17:27:48.736023] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:53.324 pt2 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:53.583 17:27:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:53.583 17:27:48 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:54.150 17:27:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:54.150 "name": "raid_bdev1", 00:33:54.150 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:54.150 "strip_size_kb": 0, 00:33:54.150 "state": "online", 00:33:54.150 "raid_level": "raid1", 00:33:54.150 "superblock": true, 00:33:54.150 "num_base_bdevs": 2, 00:33:54.150 "num_base_bdevs_discovered": 2, 00:33:54.150 "num_base_bdevs_operational": 2, 00:33:54.150 "base_bdevs_list": [ 00:33:54.150 { 00:33:54.150 "name": "pt1", 00:33:54.150 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:54.150 "is_configured": true, 00:33:54.150 "data_offset": 256, 00:33:54.150 "data_size": 7936 00:33:54.150 }, 00:33:54.150 { 00:33:54.150 "name": "pt2", 00:33:54.150 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:54.150 "is_configured": true, 00:33:54.150 "data_offset": 256, 00:33:54.150 "data_size": 7936 00:33:54.150 } 00:33:54.150 ] 00:33:54.150 }' 00:33:54.150 17:27:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:54.150 17:27:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:33:55.085 17:27:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:55.085 [2024-07-23 17:27:50.384051] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:55.085 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:33:55.085 "name": "raid_bdev1", 00:33:55.085 "aliases": [ 00:33:55.085 "90df058d-086a-4f8a-93ac-72bab19468fb" 00:33:55.085 ], 00:33:55.085 "product_name": "Raid Volume", 00:33:55.085 "block_size": 4128, 00:33:55.085 "num_blocks": 7936, 00:33:55.085 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:55.085 "md_size": 32, 00:33:55.085 "md_interleave": true, 00:33:55.085 "dif_type": 0, 00:33:55.085 "assigned_rate_limits": { 00:33:55.085 "rw_ios_per_sec": 0, 00:33:55.085 "rw_mbytes_per_sec": 0, 00:33:55.086 "r_mbytes_per_sec": 0, 00:33:55.086 "w_mbytes_per_sec": 0 00:33:55.086 }, 00:33:55.086 "claimed": false, 00:33:55.086 "zoned": false, 00:33:55.086 "supported_io_types": { 00:33:55.086 "read": true, 00:33:55.086 "write": true, 00:33:55.086 "unmap": false, 00:33:55.086 "flush": false, 00:33:55.086 "reset": true, 00:33:55.086 "nvme_admin": false, 00:33:55.086 "nvme_io": false, 00:33:55.086 "nvme_io_md": false, 00:33:55.086 "write_zeroes": true, 00:33:55.086 "zcopy": false, 00:33:55.086 "get_zone_info": false, 00:33:55.086 "zone_management": false, 00:33:55.086 "zone_append": false, 00:33:55.086 "compare": false, 00:33:55.086 "compare_and_write": false, 00:33:55.086 "abort": false, 00:33:55.086 "seek_hole": false, 00:33:55.086 "seek_data": false, 00:33:55.086 "copy": false, 00:33:55.086 "nvme_iov_md": false 00:33:55.086 }, 
00:33:55.086 "memory_domains": [ 00:33:55.086 { 00:33:55.086 "dma_device_id": "system", 00:33:55.086 "dma_device_type": 1 00:33:55.086 }, 00:33:55.086 { 00:33:55.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:55.086 "dma_device_type": 2 00:33:55.086 }, 00:33:55.086 { 00:33:55.086 "dma_device_id": "system", 00:33:55.086 "dma_device_type": 1 00:33:55.086 }, 00:33:55.086 { 00:33:55.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:55.086 "dma_device_type": 2 00:33:55.086 } 00:33:55.086 ], 00:33:55.086 "driver_specific": { 00:33:55.086 "raid": { 00:33:55.086 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:55.086 "strip_size_kb": 0, 00:33:55.086 "state": "online", 00:33:55.086 "raid_level": "raid1", 00:33:55.086 "superblock": true, 00:33:55.086 "num_base_bdevs": 2, 00:33:55.086 "num_base_bdevs_discovered": 2, 00:33:55.086 "num_base_bdevs_operational": 2, 00:33:55.086 "base_bdevs_list": [ 00:33:55.086 { 00:33:55.086 "name": "pt1", 00:33:55.086 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:55.086 "is_configured": true, 00:33:55.086 "data_offset": 256, 00:33:55.086 "data_size": 7936 00:33:55.086 }, 00:33:55.086 { 00:33:55.086 "name": "pt2", 00:33:55.086 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:55.086 "is_configured": true, 00:33:55.086 "data_offset": 256, 00:33:55.086 "data_size": 7936 00:33:55.086 } 00:33:55.086 ] 00:33:55.086 } 00:33:55.086 } 00:33:55.086 }' 00:33:55.086 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:33:55.086 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:33:55.086 pt2' 00:33:55.086 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:55.086 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:33:55.086 17:27:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:55.653 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:55.653 "name": "pt1", 00:33:55.653 "aliases": [ 00:33:55.653 "00000000-0000-0000-0000-000000000001" 00:33:55.653 ], 00:33:55.653 "product_name": "passthru", 00:33:55.653 "block_size": 4128, 00:33:55.653 "num_blocks": 8192, 00:33:55.653 "uuid": "00000000-0000-0000-0000-000000000001", 00:33:55.653 "md_size": 32, 00:33:55.653 "md_interleave": true, 00:33:55.653 "dif_type": 0, 00:33:55.653 "assigned_rate_limits": { 00:33:55.653 "rw_ios_per_sec": 0, 00:33:55.653 "rw_mbytes_per_sec": 0, 00:33:55.653 "r_mbytes_per_sec": 0, 00:33:55.653 "w_mbytes_per_sec": 0 00:33:55.653 }, 00:33:55.653 "claimed": true, 00:33:55.653 "claim_type": "exclusive_write", 00:33:55.653 "zoned": false, 00:33:55.653 "supported_io_types": { 00:33:55.653 "read": true, 00:33:55.653 "write": true, 00:33:55.653 "unmap": true, 00:33:55.653 "flush": true, 00:33:55.653 "reset": true, 00:33:55.653 "nvme_admin": false, 00:33:55.653 "nvme_io": false, 00:33:55.653 "nvme_io_md": false, 00:33:55.653 "write_zeroes": true, 00:33:55.653 "zcopy": true, 00:33:55.653 "get_zone_info": false, 00:33:55.653 "zone_management": false, 00:33:55.653 "zone_append": false, 00:33:55.653 "compare": false, 00:33:55.653 "compare_and_write": false, 00:33:55.653 "abort": true, 00:33:55.653 "seek_hole": false, 00:33:55.653 "seek_data": false, 00:33:55.653 "copy": true, 00:33:55.653 "nvme_iov_md": false 00:33:55.653 }, 00:33:55.653 "memory_domains": [ 00:33:55.653 { 00:33:55.653 "dma_device_id": "system", 00:33:55.653 "dma_device_type": 1 00:33:55.653 }, 00:33:55.653 { 00:33:55.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:55.653 "dma_device_type": 2 00:33:55.653 } 00:33:55.653 ], 00:33:55.653 
"driver_specific": { 00:33:55.653 "passthru": { 00:33:55.653 "name": "pt1", 00:33:55.653 "base_bdev_name": "malloc1" 00:33:55.653 } 00:33:55.653 } 00:33:55.653 }' 00:33:55.653 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:33:55.912 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:56.171 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:56.171 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:33:56.171 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:33:56.171 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:33:56.171 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:33:56.429 17:27:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:33:56.429 "name": "pt2", 00:33:56.429 "aliases": [ 00:33:56.429 "00000000-0000-0000-0000-000000000002" 00:33:56.429 ], 00:33:56.429 "product_name": "passthru", 00:33:56.429 "block_size": 4128, 00:33:56.429 "num_blocks": 8192, 00:33:56.429 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:56.429 "md_size": 32, 00:33:56.429 "md_interleave": true, 00:33:56.429 "dif_type": 0, 00:33:56.429 "assigned_rate_limits": { 00:33:56.429 "rw_ios_per_sec": 0, 00:33:56.429 "rw_mbytes_per_sec": 0, 00:33:56.429 "r_mbytes_per_sec": 0, 00:33:56.429 "w_mbytes_per_sec": 0 00:33:56.429 }, 00:33:56.429 "claimed": true, 00:33:56.429 "claim_type": "exclusive_write", 00:33:56.429 "zoned": false, 00:33:56.429 "supported_io_types": { 00:33:56.429 "read": true, 00:33:56.429 "write": true, 00:33:56.429 "unmap": true, 00:33:56.429 "flush": true, 00:33:56.429 "reset": true, 00:33:56.429 "nvme_admin": false, 00:33:56.429 "nvme_io": false, 00:33:56.429 "nvme_io_md": false, 00:33:56.429 "write_zeroes": true, 00:33:56.429 "zcopy": true, 00:33:56.429 "get_zone_info": false, 00:33:56.429 "zone_management": false, 00:33:56.429 "zone_append": false, 00:33:56.429 "compare": false, 00:33:56.429 "compare_and_write": false, 00:33:56.429 "abort": true, 00:33:56.429 "seek_hole": false, 00:33:56.429 "seek_data": false, 00:33:56.429 "copy": true, 00:33:56.429 "nvme_iov_md": false 00:33:56.429 }, 00:33:56.429 "memory_domains": [ 00:33:56.429 { 00:33:56.429 "dma_device_id": "system", 00:33:56.429 "dma_device_type": 1 00:33:56.429 }, 00:33:56.429 { 00:33:56.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:56.429 "dma_device_type": 2 00:33:56.429 } 00:33:56.429 ], 00:33:56.429 "driver_specific": { 00:33:56.429 "passthru": { 00:33:56.429 "name": "pt2", 00:33:56.429 "base_bdev_name": "malloc2" 00:33:56.429 } 00:33:56.429 } 00:33:56.429 }' 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:56.429 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:33:56.688 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:33:56.688 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:56.688 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:33:56.688 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:33:56.688 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:33:56.688 17:27:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:33:56.947 [2024-07-23 17:27:52.188816] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:33:56.947 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 90df058d-086a-4f8a-93ac-72bab19468fb '!=' 90df058d-086a-4f8a-93ac-72bab19468fb ']' 00:33:56.947 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:33:56.947 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:33:56.947 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:33:56.947 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:33:57.206 [2024-07-23 17:27:52.481368] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:57.206 17:27:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:57.774 17:27:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:57.774 "name": "raid_bdev1", 00:33:57.774 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:57.774 "strip_size_kb": 0, 00:33:57.774 "state": "online", 00:33:57.774 "raid_level": "raid1", 00:33:57.774 "superblock": true, 00:33:57.774 "num_base_bdevs": 2, 00:33:57.774 "num_base_bdevs_discovered": 1, 00:33:57.774 "num_base_bdevs_operational": 1, 00:33:57.774 "base_bdevs_list": [ 00:33:57.774 { 00:33:57.774 "name": null, 00:33:57.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:57.774 "is_configured": false, 00:33:57.774 "data_offset": 256, 00:33:57.774 "data_size": 7936 00:33:57.774 }, 00:33:57.774 { 00:33:57.774 "name": "pt2", 00:33:57.774 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:57.774 "is_configured": true, 00:33:57.774 "data_offset": 256, 00:33:57.774 "data_size": 7936 00:33:57.774 } 00:33:57.774 ] 00:33:57.774 }' 00:33:57.774 17:27:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:57.774 17:27:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:33:58.342 17:27:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:33:58.601 [2024-07-23 17:27:53.885073] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:33:58.601 [2024-07-23 17:27:53.885100] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:33:58.601 [2024-07-23 17:27:53.885148] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:33:58.601 [2024-07-23 
17:27:53.885191] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:33:58.601 [2024-07-23 17:27:53.885205] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2317fb0 name raid_bdev1, state offline 00:33:58.601 17:27:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:58.601 17:27:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:33:58.859 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:33:58.859 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:33:58.859 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:33:58.859 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:33:58.859 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:33:59.118 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:33:59.118 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:33:59.118 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:33:59.118 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:33:59.118 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:33:59.118 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:33:59.379 [2024-07-23 17:27:54.574861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:33:59.379 [2024-07-23 17:27:54.574910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:59.379 [2024-07-23 17:27:54.574927] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23172a0 00:33:59.379 [2024-07-23 17:27:54.574940] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:59.379 [2024-07-23 17:27:54.576322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:59.379 [2024-07-23 17:27:54.576350] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:33:59.379 [2024-07-23 17:27:54.576397] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:33:59.379 [2024-07-23 17:27:54.576422] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:33:59.379 [2024-07-23 17:27:54.576487] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23175c0 00:33:59.379 [2024-07-23 17:27:54.576497] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:33:59.379 [2024-07-23 17:27:54.576551] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2315f00 00:33:59.379 [2024-07-23 17:27:54.576621] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23175c0 00:33:59.379 [2024-07-23 17:27:54.576631] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23175c0 00:33:59.379 [2024-07-23 17:27:54.576685] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:33:59.379 pt2 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:33:59.379 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:33:59.638 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:33:59.638 "name": "raid_bdev1", 00:33:59.638 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:33:59.638 "strip_size_kb": 0, 00:33:59.638 "state": "online", 00:33:59.638 "raid_level": "raid1", 00:33:59.638 "superblock": true, 00:33:59.638 "num_base_bdevs": 2, 00:33:59.638 "num_base_bdevs_discovered": 1, 00:33:59.638 "num_base_bdevs_operational": 1, 00:33:59.638 
"base_bdevs_list": [ 00:33:59.638 { 00:33:59.638 "name": null, 00:33:59.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:33:59.638 "is_configured": false, 00:33:59.638 "data_offset": 256, 00:33:59.638 "data_size": 7936 00:33:59.638 }, 00:33:59.638 { 00:33:59.638 "name": "pt2", 00:33:59.638 "uuid": "00000000-0000-0000-0000-000000000002", 00:33:59.638 "is_configured": true, 00:33:59.638 "data_offset": 256, 00:33:59.638 "data_size": 7936 00:33:59.638 } 00:33:59.638 ] 00:33:59.638 }' 00:33:59.638 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:33:59.638 17:27:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:00.206 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:00.465 [2024-07-23 17:27:55.669758] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:00.465 [2024-07-23 17:27:55.669782] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:00.465 [2024-07-23 17:27:55.669832] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:00.465 [2024-07-23 17:27:55.669874] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:00.465 [2024-07-23 17:27:55.669886] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23175c0 name raid_bdev1, state offline 00:34:00.465 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:00.465 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:34:00.723 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:34:00.723 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:34:00.723 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:34:00.723 17:27:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:34:00.982 [2024-07-23 17:27:56.167054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:34:00.982 [2024-07-23 17:27:56.167101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:00.982 [2024-07-23 17:27:56.167119] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2315d00 00:34:00.982 [2024-07-23 17:27:56.167132] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:00.982 [2024-07-23 17:27:56.168515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:00.982 [2024-07-23 17:27:56.168541] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:34:00.982 [2024-07-23 17:27:56.168584] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:34:00.982 [2024-07-23 17:27:56.168606] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:34:00.982 [2024-07-23 17:27:56.168685] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:34:00.982 [2024-07-23 17:27:56.168697] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:34:00.982 [2024-07-23 17:27:56.168712] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2316410 name raid_bdev1, state configuring 00:34:00.982 [2024-07-23 17:27:56.168733] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:34:00.982 [2024-07-23 17:27:56.168784] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2315f50 00:34:00.982 [2024-07-23 17:27:56.168794] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:34:00.982 [2024-07-23 17:27:56.168843] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2316b00 00:34:00.982 [2024-07-23 17:27:56.168919] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2315f50 00:34:00.982 [2024-07-23 17:27:56.168929] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2315f50 00:34:00.982 [2024-07-23 17:27:56.168988] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:00.982 pt1 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:00.982 17:27:56 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:00.982 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:01.241 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:01.241 "name": "raid_bdev1", 00:34:01.241 "uuid": "90df058d-086a-4f8a-93ac-72bab19468fb", 00:34:01.241 "strip_size_kb": 0, 00:34:01.241 "state": "online", 00:34:01.241 "raid_level": "raid1", 00:34:01.241 "superblock": true, 00:34:01.241 "num_base_bdevs": 2, 00:34:01.241 "num_base_bdevs_discovered": 1, 00:34:01.241 "num_base_bdevs_operational": 1, 00:34:01.241 "base_bdevs_list": [ 00:34:01.241 { 00:34:01.241 "name": null, 00:34:01.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:01.241 "is_configured": false, 00:34:01.241 "data_offset": 256, 00:34:01.241 "data_size": 7936 00:34:01.241 }, 00:34:01.241 { 00:34:01.241 "name": "pt2", 00:34:01.241 "uuid": "00000000-0000-0000-0000-000000000002", 00:34:01.241 "is_configured": true, 00:34:01.241 "data_offset": 256, 00:34:01.241 "data_size": 7936 00:34:01.241 } 00:34:01.241 ] 00:34:01.241 }' 00:34:01.241 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:01.241 17:27:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:02.178 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:34:02.178 
17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:34:02.436 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:34:02.436 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:34:02.436 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:34:02.694 [2024-07-23 17:27:57.879834] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:02.694 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 90df058d-086a-4f8a-93ac-72bab19468fb '!=' 90df058d-086a-4f8a-93ac-72bab19468fb ']' 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 78570 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 78570 ']' 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 78570 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 78570 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 78570' 00:34:02.695 killing process with pid 78570 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 78570 00:34:02.695 [2024-07-23 17:27:57.949590] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:02.695 [2024-07-23 17:27:57.949643] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:02.695 [2024-07-23 17:27:57.949686] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:02.695 [2024-07-23 17:27:57.949697] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2315f50 name raid_bdev1, state offline 00:34:02.695 17:27:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 78570 00:34:02.695 [2024-07-23 17:27:57.969618] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:02.953 17:27:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:34:02.953 00:34:02.953 real 0m21.106s 00:34:02.953 user 0m38.679s 00:34:02.953 sys 0m3.574s 00:34:02.953 17:27:58 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:02.953 17:27:58 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:02.953 ************************************ 00:34:02.953 END TEST raid_superblock_test_md_interleaved 00:34:02.953 ************************************ 00:34:02.953 17:27:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:34:02.953 17:27:58 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:34:02.953 17:27:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:02.953 17:27:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:02.953 17:27:58 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:34:02.953 ************************************ 00:34:02.953 START TEST raid_rebuild_test_sb_md_interleaved 00:34:02.953 ************************************ 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:34:02.953 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=81631 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 81631 /var/tmp/spdk-raid.sock 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 81631 ']' 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:34:02.954 17:27:58 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:34:02.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:02.954 17:27:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:02.954 [2024-07-23 17:27:58.337013] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:02.954 [2024-07-23 17:27:58.337080] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid81631 ] 00:34:02.954 I/O size of 3145728 is greater than zero copy threshold (65536). 00:34:02.954 Zero copy mechanism will not be used. 
00:34:03.212 [2024-07-23 17:27:58.469979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:03.212 [2024-07-23 17:27:58.525671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:03.212 [2024-07-23 17:27:58.591171] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:03.212 [2024-07-23 17:27:58.591236] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:34:04.149 17:27:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:04.149 17:27:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:34:04.149 17:27:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:34:04.149 17:27:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:34:04.717 BaseBdev1_malloc 00:34:04.717 17:28:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:34:04.976 [2024-07-23 17:28:00.289636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:34:04.976 [2024-07-23 17:28:00.289686] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:04.976 [2024-07-23 17:28:00.289712] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3d3a0 00:34:04.976 [2024-07-23 17:28:00.289724] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:04.976 [2024-07-23 17:28:00.291180] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:04.976 [2024-07-23 17:28:00.291206] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:04.976 BaseBdev1 00:34:04.976 17:28:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:34:04.976 17:28:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:34:05.543 BaseBdev2_malloc 00:34:05.543 17:28:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:34:06.111 [2024-07-23 17:28:01.330604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:34:06.111 [2024-07-23 17:28:01.330658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:06.111 [2024-07-23 17:28:01.330680] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe228e0 00:34:06.111 [2024-07-23 17:28:01.330692] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:06.111 [2024-07-23 17:28:01.332221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:06.111 [2024-07-23 17:28:01.332247] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:34:06.111 BaseBdev2 00:34:06.111 17:28:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:34:06.370 spare_malloc 00:34:06.370 17:28:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 
0 -t 0 -w 100000 -n 100000 00:34:06.629 spare_delay 00:34:06.629 17:28:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:06.629 [2024-07-23 17:28:02.049475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:06.629 [2024-07-23 17:28:02.049518] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:06.629 [2024-07-23 17:28:02.049539] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3a110 00:34:06.629 [2024-07-23 17:28:02.049551] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:06.629 [2024-07-23 17:28:02.050827] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:06.629 [2024-07-23 17:28:02.050852] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:06.888 spare 00:34:06.888 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:34:07.147 [2024-07-23 17:28:02.558844] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:07.147 [2024-07-23 17:28:02.560180] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:07.147 [2024-07-23 17:28:02.560339] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf3a7b0 00:34:07.147 [2024-07-23 17:28:02.560352] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:34:07.147 [2024-07-23 17:28:02.560421] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xda0890 00:34:07.147 [2024-07-23 17:28:02.560502] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid 
bdev generic 0xf3a7b0 00:34:07.147 [2024-07-23 17:28:02.560512] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf3a7b0 00:34:07.147 [2024-07-23 17:28:02.560569] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:07.406 17:28:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:07.974 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:34:07.974 "name": "raid_bdev1", 00:34:07.974 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:07.974 "strip_size_kb": 0, 00:34:07.974 "state": "online", 00:34:07.974 "raid_level": "raid1", 00:34:07.974 "superblock": true, 00:34:07.974 "num_base_bdevs": 2, 00:34:07.974 "num_base_bdevs_discovered": 2, 00:34:07.974 "num_base_bdevs_operational": 2, 00:34:07.974 "base_bdevs_list": [ 00:34:07.974 { 00:34:07.974 "name": "BaseBdev1", 00:34:07.974 "uuid": "8b320ba2-bba4-5653-99f4-3513ded1d021", 00:34:07.974 "is_configured": true, 00:34:07.974 "data_offset": 256, 00:34:07.974 "data_size": 7936 00:34:07.974 }, 00:34:07.974 { 00:34:07.974 "name": "BaseBdev2", 00:34:07.974 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:07.974 "is_configured": true, 00:34:07.974 "data_offset": 256, 00:34:07.974 "data_size": 7936 00:34:07.974 } 00:34:07.974 ] 00:34:07.974 }' 00:34:07.974 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:07.974 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:08.541 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:34:08.541 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:34:08.541 [2024-07-23 17:28:03.938719] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:34:08.541 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:34:08.541 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:08.541 17:28:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:34:08.800 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:34:08.800 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:34:08.800 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:34:08.800 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:34:09.058 [2024-07-23 17:28:04.431766] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:09.058 
17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:09.058 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:09.317 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:09.317 "name": "raid_bdev1", 00:34:09.317 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:09.317 "strip_size_kb": 0, 00:34:09.317 "state": "online", 00:34:09.317 "raid_level": "raid1", 00:34:09.317 "superblock": true, 00:34:09.317 "num_base_bdevs": 2, 00:34:09.317 "num_base_bdevs_discovered": 1, 00:34:09.317 "num_base_bdevs_operational": 1, 00:34:09.317 "base_bdevs_list": [ 00:34:09.317 { 00:34:09.317 "name": null, 00:34:09.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:09.317 "is_configured": false, 00:34:09.317 "data_offset": 256, 00:34:09.317 "data_size": 7936 00:34:09.317 }, 00:34:09.317 { 00:34:09.317 "name": "BaseBdev2", 00:34:09.317 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:09.317 "is_configured": true, 00:34:09.317 "data_offset": 256, 00:34:09.317 "data_size": 7936 00:34:09.317 } 00:34:09.317 ] 00:34:09.317 }' 00:34:09.317 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:09.317 17:28:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:10.253 17:28:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:10.253 [2024-07-23 17:28:05.582828] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:10.253 [2024-07-23 17:28:05.586498] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0xda0cc0 00:34:10.253 [2024-07-23 17:28:05.588800] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:10.253 17:28:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:11.629 "name": "raid_bdev1", 00:34:11.629 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:11.629 "strip_size_kb": 0, 00:34:11.629 "state": "online", 00:34:11.629 "raid_level": "raid1", 00:34:11.629 "superblock": true, 00:34:11.629 "num_base_bdevs": 2, 00:34:11.629 "num_base_bdevs_discovered": 2, 00:34:11.629 "num_base_bdevs_operational": 2, 00:34:11.629 "process": { 00:34:11.629 "type": "rebuild", 00:34:11.629 "target": "spare", 00:34:11.629 "progress": { 00:34:11.629 "blocks": 3072, 00:34:11.629 "percent": 38 00:34:11.629 } 00:34:11.629 }, 00:34:11.629 "base_bdevs_list": [ 00:34:11.629 { 
00:34:11.629 "name": "spare", 00:34:11.629 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:11.629 "is_configured": true, 00:34:11.629 "data_offset": 256, 00:34:11.629 "data_size": 7936 00:34:11.629 }, 00:34:11.629 { 00:34:11.629 "name": "BaseBdev2", 00:34:11.629 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:11.629 "is_configured": true, 00:34:11.629 "data_offset": 256, 00:34:11.629 "data_size": 7936 00:34:11.629 } 00:34:11.629 ] 00:34:11.629 }' 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:11.629 17:28:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:34:11.888 [2024-07-23 17:28:07.178196] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:11.888 [2024-07-23 17:28:07.201201] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:11.888 [2024-07-23 17:28:07.201245] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:11.888 [2024-07-23 17:28:07.201260] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:11.888 [2024-07-23 17:28:07.201268] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:11.888 17:28:07 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:11.888 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:12.146 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:12.146 "name": "raid_bdev1", 00:34:12.146 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:12.146 "strip_size_kb": 0, 00:34:12.146 "state": "online", 00:34:12.146 "raid_level": "raid1", 00:34:12.146 "superblock": true, 00:34:12.146 "num_base_bdevs": 2, 00:34:12.146 "num_base_bdevs_discovered": 1, 00:34:12.146 "num_base_bdevs_operational": 1, 00:34:12.146 "base_bdevs_list": [ 00:34:12.146 { 00:34:12.146 "name": null, 00:34:12.146 
"uuid": "00000000-0000-0000-0000-000000000000", 00:34:12.146 "is_configured": false, 00:34:12.146 "data_offset": 256, 00:34:12.146 "data_size": 7936 00:34:12.146 }, 00:34:12.146 { 00:34:12.146 "name": "BaseBdev2", 00:34:12.146 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:12.146 "is_configured": true, 00:34:12.146 "data_offset": 256, 00:34:12.146 "data_size": 7936 00:34:12.146 } 00:34:12.146 ] 00:34:12.146 }' 00:34:12.146 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:12.146 17:28:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:12.713 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:12.713 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:12.714 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:12.714 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:12.714 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:12.973 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:12.973 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:12.973 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:12.973 "name": "raid_bdev1", 00:34:12.973 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:12.973 "strip_size_kb": 0, 00:34:12.973 "state": "online", 00:34:12.973 "raid_level": "raid1", 00:34:12.973 "superblock": true, 00:34:12.973 
"num_base_bdevs": 2, 00:34:12.973 "num_base_bdevs_discovered": 1, 00:34:12.973 "num_base_bdevs_operational": 1, 00:34:12.973 "base_bdevs_list": [ 00:34:12.973 { 00:34:12.973 "name": null, 00:34:12.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:12.973 "is_configured": false, 00:34:12.973 "data_offset": 256, 00:34:12.973 "data_size": 7936 00:34:12.973 }, 00:34:12.973 { 00:34:12.973 "name": "BaseBdev2", 00:34:12.973 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:12.973 "is_configured": true, 00:34:12.973 "data_offset": 256, 00:34:12.973 "data_size": 7936 00:34:12.973 } 00:34:12.973 ] 00:34:12.973 }' 00:34:12.973 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:13.232 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:13.232 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:13.232 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:13.232 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:13.491 [2024-07-23 17:28:08.697531] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:13.491 [2024-07-23 17:28:08.701007] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf3b810 00:34:13.491 [2024-07-23 17:28:08.702434] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:13.491 17:28:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:14.425 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:14.683 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:14.683 "name": "raid_bdev1", 00:34:14.683 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:14.683 "strip_size_kb": 0, 00:34:14.683 "state": "online", 00:34:14.683 "raid_level": "raid1", 00:34:14.684 "superblock": true, 00:34:14.684 "num_base_bdevs": 2, 00:34:14.684 "num_base_bdevs_discovered": 2, 00:34:14.684 "num_base_bdevs_operational": 2, 00:34:14.684 "process": { 00:34:14.684 "type": "rebuild", 00:34:14.684 "target": "spare", 00:34:14.684 "progress": { 00:34:14.684 "blocks": 3072, 00:34:14.684 "percent": 38 00:34:14.684 } 00:34:14.684 }, 00:34:14.684 "base_bdevs_list": [ 00:34:14.684 { 00:34:14.684 "name": "spare", 00:34:14.684 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:14.684 "is_configured": true, 00:34:14.684 "data_offset": 256, 00:34:14.684 "data_size": 7936 00:34:14.684 }, 00:34:14.684 { 00:34:14.684 "name": "BaseBdev2", 00:34:14.684 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:14.684 "is_configured": true, 00:34:14.684 "data_offset": 256, 00:34:14.684 "data_size": 7936 00:34:14.684 
} 00:34:14.684 ] 00:34:14.684 }' 00:34:14.684 17:28:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:34:14.684 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1190 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:14.684 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:14.942 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:14.942 "name": "raid_bdev1", 00:34:14.942 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:14.942 "strip_size_kb": 0, 00:34:14.942 "state": "online", 00:34:14.942 "raid_level": "raid1", 00:34:14.942 "superblock": true, 00:34:14.942 "num_base_bdevs": 2, 00:34:14.942 "num_base_bdevs_discovered": 2, 00:34:14.942 "num_base_bdevs_operational": 2, 00:34:14.942 "process": { 00:34:14.942 "type": "rebuild", 00:34:14.942 "target": "spare", 00:34:14.942 "progress": { 00:34:14.942 "blocks": 3840, 00:34:14.942 "percent": 48 00:34:14.942 } 00:34:14.942 }, 00:34:14.942 "base_bdevs_list": [ 00:34:14.942 { 00:34:14.942 "name": "spare", 00:34:14.942 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:14.942 "is_configured": true, 00:34:14.942 "data_offset": 256, 00:34:14.942 "data_size": 7936 00:34:14.942 }, 00:34:14.942 { 00:34:14.942 "name": "BaseBdev2", 00:34:14.942 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:14.942 "is_configured": true, 00:34:14.942 "data_offset": 256, 00:34:14.942 "data_size": 7936 00:34:14.942 } 00:34:14.942 ] 00:34:14.942 }' 00:34:14.942 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:15.199 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:15.199 17:28:10 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:15.199 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:15.199 17:28:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:16.136 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:16.395 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:16.395 "name": "raid_bdev1", 00:34:16.395 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:16.395 "strip_size_kb": 0, 00:34:16.395 "state": "online", 00:34:16.395 "raid_level": "raid1", 00:34:16.395 "superblock": true, 00:34:16.395 "num_base_bdevs": 2, 00:34:16.395 "num_base_bdevs_discovered": 2, 00:34:16.395 "num_base_bdevs_operational": 2, 00:34:16.395 "process": { 00:34:16.395 "type": "rebuild", 00:34:16.395 
"target": "spare", 00:34:16.395 "progress": { 00:34:16.395 "blocks": 7424, 00:34:16.395 "percent": 93 00:34:16.395 } 00:34:16.395 }, 00:34:16.395 "base_bdevs_list": [ 00:34:16.395 { 00:34:16.395 "name": "spare", 00:34:16.395 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:16.395 "is_configured": true, 00:34:16.395 "data_offset": 256, 00:34:16.395 "data_size": 7936 00:34:16.395 }, 00:34:16.395 { 00:34:16.395 "name": "BaseBdev2", 00:34:16.395 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:16.395 "is_configured": true, 00:34:16.395 "data_offset": 256, 00:34:16.395 "data_size": 7936 00:34:16.395 } 00:34:16.395 ] 00:34:16.395 }' 00:34:16.395 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:16.395 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:16.395 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:16.395 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:16.395 17:28:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:34:16.655 [2024-07-23 17:28:11.826632] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:34:16.655 [2024-07-23 17:28:11.826688] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:34:16.655 [2024-07-23 17:28:11.826771] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:17.629 17:28:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:17.629 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:17.629 "name": "raid_bdev1", 00:34:17.629 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:17.629 "strip_size_kb": 0, 00:34:17.629 "state": "online", 00:34:17.629 "raid_level": "raid1", 00:34:17.629 "superblock": true, 00:34:17.629 "num_base_bdevs": 2, 00:34:17.629 "num_base_bdevs_discovered": 2, 00:34:17.629 "num_base_bdevs_operational": 2, 00:34:17.629 "base_bdevs_list": [ 00:34:17.629 { 00:34:17.629 "name": "spare", 00:34:17.629 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:17.629 "is_configured": true, 00:34:17.629 "data_offset": 256, 00:34:17.629 "data_size": 7936 00:34:17.629 }, 00:34:17.629 { 00:34:17.629 "name": "BaseBdev2", 00:34:17.629 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:17.629 "is_configured": true, 00:34:17.629 "data_offset": 256, 00:34:17.629 "data_size": 7936 00:34:17.629 } 00:34:17.629 ] 00:34:17.629 }' 00:34:17.629 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:17.888 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:18.145 "name": "raid_bdev1", 00:34:18.145 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:18.145 "strip_size_kb": 0, 00:34:18.145 "state": "online", 00:34:18.145 "raid_level": "raid1", 00:34:18.145 "superblock": true, 00:34:18.145 "num_base_bdevs": 2, 00:34:18.145 "num_base_bdevs_discovered": 2, 00:34:18.145 "num_base_bdevs_operational": 2, 00:34:18.145 "base_bdevs_list": [ 00:34:18.145 { 00:34:18.145 "name": "spare", 00:34:18.145 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:18.145 
"is_configured": true, 00:34:18.145 "data_offset": 256, 00:34:18.145 "data_size": 7936 00:34:18.145 }, 00:34:18.145 { 00:34:18.145 "name": "BaseBdev2", 00:34:18.145 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:18.145 "is_configured": true, 00:34:18.145 "data_offset": 256, 00:34:18.145 "data_size": 7936 00:34:18.145 } 00:34:18.145 ] 00:34:18.145 }' 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:18.145 17:28:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:18.711 17:28:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:18.711 "name": "raid_bdev1", 00:34:18.711 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:18.711 "strip_size_kb": 0, 00:34:18.711 "state": "online", 00:34:18.711 "raid_level": "raid1", 00:34:18.711 "superblock": true, 00:34:18.711 "num_base_bdevs": 2, 00:34:18.711 "num_base_bdevs_discovered": 2, 00:34:18.711 "num_base_bdevs_operational": 2, 00:34:18.711 "base_bdevs_list": [ 00:34:18.711 { 00:34:18.711 "name": "spare", 00:34:18.711 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:18.711 "is_configured": true, 00:34:18.711 "data_offset": 256, 00:34:18.711 "data_size": 7936 00:34:18.711 }, 00:34:18.711 { 00:34:18.711 "name": "BaseBdev2", 00:34:18.711 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:18.711 "is_configured": true, 00:34:18.711 "data_offset": 256, 00:34:18.711 "data_size": 7936 00:34:18.711 } 00:34:18.711 ] 00:34:18.711 }' 00:34:18.711 17:28:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:18.711 17:28:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:19.277 17:28:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:34:19.535 [2024-07-23 17:28:14.887355] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:34:19.535 [2024-07-23 17:28:14.887381] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:34:19.535 [2024-07-23 17:28:14.887438] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:19.535 [2024-07-23 17:28:14.887501] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:19.535 [2024-07-23 17:28:14.887514] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3a7b0 name raid_bdev1, state offline 00:34:19.535 17:28:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:19.535 17:28:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:34:19.794 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:34:19.794 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:34:19.794 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:34:19.794 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:34:20.052 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:20.310 [2024-07-23 17:28:15.681423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:20.310 [2024-07-23 17:28:15.681472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:20.310 [2024-07-23 17:28:15.681495] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9fe40 00:34:20.310 [2024-07-23 17:28:15.681507] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:20.310 [2024-07-23 17:28:15.683221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:20.310 [2024-07-23 17:28:15.683249] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:20.310 [2024-07-23 17:28:15.683306] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:34:20.310 [2024-07-23 17:28:15.683330] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:20.310 [2024-07-23 17:28:15.683413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:34:20.310 spare 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:20.310 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:20.311 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:20.311 17:28:15 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:20.311 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:20.311 17:28:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:20.569 [2024-07-23 17:28:15.783723] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf26650 00:34:20.569 [2024-07-23 17:28:15.783736] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:34:20.569 [2024-07-23 17:28:15.783812] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf276e0 00:34:20.569 [2024-07-23 17:28:15.783915] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf26650 00:34:20.569 [2024-07-23 17:28:15.783925] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf26650 00:34:20.569 [2024-07-23 17:28:15.783994] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:20.827 17:28:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:20.827 "name": "raid_bdev1", 00:34:20.827 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:20.827 "strip_size_kb": 0, 00:34:20.827 "state": "online", 00:34:20.827 "raid_level": "raid1", 00:34:20.827 "superblock": true, 00:34:20.827 "num_base_bdevs": 2, 00:34:20.827 "num_base_bdevs_discovered": 2, 00:34:20.827 "num_base_bdevs_operational": 2, 00:34:20.827 "base_bdevs_list": [ 00:34:20.828 { 00:34:20.828 "name": "spare", 00:34:20.828 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:20.828 "is_configured": true, 00:34:20.828 "data_offset": 256, 00:34:20.828 "data_size": 7936 00:34:20.828 }, 00:34:20.828 { 00:34:20.828 "name": "BaseBdev2", 00:34:20.828 "uuid": 
"d251f040-f96e-5543-a5bc-a406931a703e", 00:34:20.828 "is_configured": true, 00:34:20.828 "data_offset": 256, 00:34:20.828 "data_size": 7936 00:34:20.828 } 00:34:20.828 ] 00:34:20.828 }' 00:34:20.828 17:28:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:20.828 17:28:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:21.763 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:21.763 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:21.763 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:21.763 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:21.763 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:22.021 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:22.021 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:22.280 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:22.280 "name": "raid_bdev1", 00:34:22.280 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:22.280 "strip_size_kb": 0, 00:34:22.280 "state": "online", 00:34:22.280 "raid_level": "raid1", 00:34:22.280 "superblock": true, 00:34:22.280 "num_base_bdevs": 2, 00:34:22.280 "num_base_bdevs_discovered": 2, 00:34:22.280 "num_base_bdevs_operational": 2, 00:34:22.280 "base_bdevs_list": [ 00:34:22.280 { 00:34:22.280 "name": "spare", 00:34:22.280 "uuid": 
"5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:22.280 "is_configured": true, 00:34:22.280 "data_offset": 256, 00:34:22.280 "data_size": 7936 00:34:22.280 }, 00:34:22.280 { 00:34:22.280 "name": "BaseBdev2", 00:34:22.280 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:22.280 "is_configured": true, 00:34:22.280 "data_offset": 256, 00:34:22.280 "data_size": 7936 00:34:22.280 } 00:34:22.280 ] 00:34:22.280 }' 00:34:22.280 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:22.539 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:22.539 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:22.539 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:22.539 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:22.539 17:28:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:34:23.107 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:34:23.107 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:34:23.366 [2024-07-23 17:28:18.581297] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:23.366 17:28:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:23.935 17:28:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:23.935 "name": "raid_bdev1", 00:34:23.935 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:23.935 "strip_size_kb": 0, 00:34:23.935 "state": "online", 00:34:23.935 "raid_level": "raid1", 00:34:23.935 "superblock": true, 00:34:23.935 "num_base_bdevs": 2, 00:34:23.935 "num_base_bdevs_discovered": 1, 00:34:23.935 "num_base_bdevs_operational": 1, 00:34:23.935 "base_bdevs_list": [ 00:34:23.935 { 00:34:23.935 "name": null, 00:34:23.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:23.935 "is_configured": false, 00:34:23.935 "data_offset": 
256, 00:34:23.935 "data_size": 7936 00:34:23.935 }, 00:34:23.935 { 00:34:23.935 "name": "BaseBdev2", 00:34:23.935 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:23.935 "is_configured": true, 00:34:23.935 "data_offset": 256, 00:34:23.935 "data_size": 7936 00:34:23.935 } 00:34:23.935 ] 00:34:23.935 }' 00:34:23.935 17:28:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:23.935 17:28:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:24.503 17:28:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:34:24.762 [2024-07-23 17:28:19.969000] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:24.762 [2024-07-23 17:28:19.969139] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:34:24.762 [2024-07-23 17:28:19.969155] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:34:24.762 [2024-07-23 17:28:19.969181] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:24.762 [2024-07-23 17:28:19.972539] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf27fb0 00:34:24.762 [2024-07-23 17:28:19.974835] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:24.762 17:28:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:25.698 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:25.957 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:25.957 "name": "raid_bdev1", 00:34:25.957 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:25.957 "strip_size_kb": 0, 00:34:25.957 "state": "online", 00:34:25.957 "raid_level": "raid1", 00:34:25.957 "superblock": true, 00:34:25.957 "num_base_bdevs": 2, 00:34:25.957 "num_base_bdevs_discovered": 2, 00:34:25.957 "num_base_bdevs_operational": 2, 00:34:25.957 "process": { 00:34:25.957 "type": 
"rebuild", 00:34:25.957 "target": "spare", 00:34:25.957 "progress": { 00:34:25.957 "blocks": 3072, 00:34:25.957 "percent": 38 00:34:25.957 } 00:34:25.957 }, 00:34:25.957 "base_bdevs_list": [ 00:34:25.957 { 00:34:25.957 "name": "spare", 00:34:25.957 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:25.957 "is_configured": true, 00:34:25.957 "data_offset": 256, 00:34:25.957 "data_size": 7936 00:34:25.957 }, 00:34:25.957 { 00:34:25.957 "name": "BaseBdev2", 00:34:25.957 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:25.957 "is_configured": true, 00:34:25.957 "data_offset": 256, 00:34:25.957 "data_size": 7936 00:34:25.957 } 00:34:25.957 ] 00:34:25.957 }' 00:34:25.957 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:25.957 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:25.957 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:25.957 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:25.957 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:34:26.525 [2024-07-23 17:28:21.857284] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:26.525 [2024-07-23 17:28:21.889757] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:26.525 [2024-07-23 17:28:21.889799] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:26.525 [2024-07-23 17:28:21.889814] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:26.526 [2024-07-23 17:28:21.889822] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:26.526 17:28:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:26.785 17:28:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:26.785 "name": "raid_bdev1", 00:34:26.785 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:26.785 "strip_size_kb": 0, 00:34:26.785 "state": "online", 00:34:26.785 "raid_level": "raid1", 00:34:26.785 "superblock": true, 00:34:26.785 
"num_base_bdevs": 2, 00:34:26.785 "num_base_bdevs_discovered": 1, 00:34:26.785 "num_base_bdevs_operational": 1, 00:34:26.785 "base_bdevs_list": [ 00:34:26.785 { 00:34:26.785 "name": null, 00:34:26.785 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:26.785 "is_configured": false, 00:34:26.785 "data_offset": 256, 00:34:26.785 "data_size": 7936 00:34:26.785 }, 00:34:26.785 { 00:34:26.785 "name": "BaseBdev2", 00:34:26.785 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:26.785 "is_configured": true, 00:34:26.785 "data_offset": 256, 00:34:26.785 "data_size": 7936 00:34:26.785 } 00:34:26.785 ] 00:34:26.785 }' 00:34:26.785 17:28:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:26.785 17:28:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:27.722 17:28:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:34:28.291 [2024-07-23 17:28:23.505697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:34:28.291 [2024-07-23 17:28:23.505749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:28.291 [2024-07-23 17:28:23.505776] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf22070 00:34:28.291 [2024-07-23 17:28:23.505788] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:28.291 [2024-07-23 17:28:23.505976] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:28.291 [2024-07-23 17:28:23.505992] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:34:28.291 [2024-07-23 17:28:23.506045] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:34:28.291 [2024-07-23 17:28:23.506057] 
bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:34:28.291 [2024-07-23 17:28:23.506068] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:34:28.291 [2024-07-23 17:28:23.506086] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:34:28.291 [2024-07-23 17:28:23.509462] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xda03c0 00:34:28.291 [2024-07-23 17:28:23.510911] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:34:28.291 spare 00:34:28.291 17:28:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:29.228 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:29.487 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:29.487 "name": "raid_bdev1", 00:34:29.487 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:29.487 
"strip_size_kb": 0, 00:34:29.487 "state": "online", 00:34:29.487 "raid_level": "raid1", 00:34:29.487 "superblock": true, 00:34:29.487 "num_base_bdevs": 2, 00:34:29.487 "num_base_bdevs_discovered": 2, 00:34:29.487 "num_base_bdevs_operational": 2, 00:34:29.487 "process": { 00:34:29.487 "type": "rebuild", 00:34:29.487 "target": "spare", 00:34:29.487 "progress": { 00:34:29.487 "blocks": 3072, 00:34:29.487 "percent": 38 00:34:29.487 } 00:34:29.487 }, 00:34:29.487 "base_bdevs_list": [ 00:34:29.487 { 00:34:29.487 "name": "spare", 00:34:29.487 "uuid": "5c1ba900-99e7-593b-89fd-5fef66ba7d9a", 00:34:29.487 "is_configured": true, 00:34:29.487 "data_offset": 256, 00:34:29.487 "data_size": 7936 00:34:29.487 }, 00:34:29.487 { 00:34:29.487 "name": "BaseBdev2", 00:34:29.487 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:29.487 "is_configured": true, 00:34:29.487 "data_offset": 256, 00:34:29.487 "data_size": 7936 00:34:29.487 } 00:34:29.487 ] 00:34:29.487 }' 00:34:29.487 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:29.487 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:34:29.487 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:29.487 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:34:29.487 17:28:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:34:29.745 [2024-07-23 17:28:25.119944] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:29.745 [2024-07-23 17:28:25.123254] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:34:29.745 [2024-07-23 17:28:25.123297] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:34:29.745 [2024-07-23 17:28:25.123313] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:34:29.745 [2024-07-23 17:28:25.123321] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:29.745 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:30.003 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:30.003 "name": "raid_bdev1", 00:34:30.003 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:30.003 "strip_size_kb": 0, 00:34:30.003 "state": "online", 00:34:30.003 "raid_level": "raid1", 00:34:30.003 "superblock": true, 00:34:30.003 "num_base_bdevs": 2, 00:34:30.003 "num_base_bdevs_discovered": 1, 00:34:30.003 "num_base_bdevs_operational": 1, 00:34:30.003 "base_bdevs_list": [ 00:34:30.003 { 00:34:30.003 "name": null, 00:34:30.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:30.003 "is_configured": false, 00:34:30.003 "data_offset": 256, 00:34:30.003 "data_size": 7936 00:34:30.003 }, 00:34:30.003 { 00:34:30.003 "name": "BaseBdev2", 00:34:30.003 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:30.003 "is_configured": true, 00:34:30.003 "data_offset": 256, 00:34:30.003 "data_size": 7936 00:34:30.003 } 00:34:30.003 ] 00:34:30.003 }' 00:34:30.003 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:30.003 17:28:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:30.940 
17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:30.940 "name": "raid_bdev1", 00:34:30.940 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:30.940 "strip_size_kb": 0, 00:34:30.940 "state": "online", 00:34:30.940 "raid_level": "raid1", 00:34:30.940 "superblock": true, 00:34:30.940 "num_base_bdevs": 2, 00:34:30.940 "num_base_bdevs_discovered": 1, 00:34:30.940 "num_base_bdevs_operational": 1, 00:34:30.940 "base_bdevs_list": [ 00:34:30.940 { 00:34:30.940 "name": null, 00:34:30.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:30.940 "is_configured": false, 00:34:30.940 "data_offset": 256, 00:34:30.940 "data_size": 7936 00:34:30.940 }, 00:34:30.940 { 00:34:30.940 "name": "BaseBdev2", 00:34:30.940 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:30.940 "is_configured": true, 00:34:30.940 "data_offset": 256, 00:34:30.940 "data_size": 7936 00:34:30.940 } 00:34:30.940 ] 00:34:30.940 }' 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:30.940 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:34:31.199 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:34:31.458 [2024-07-23 17:28:26.755957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:34:31.458 [2024-07-23 17:28:26.756002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:31.458 [2024-07-23 17:28:26.756026] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf3b4c0 00:34:31.458 [2024-07-23 17:28:26.756038] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:31.458 [2024-07-23 17:28:26.756190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:31.458 [2024-07-23 17:28:26.756205] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:34:31.458 [2024-07-23 17:28:26.756247] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:34:31.458 [2024-07-23 17:28:26.756258] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:34:31.458 [2024-07-23 17:28:26.756268] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:34:31.458 BaseBdev1 00:34:31.458 17:28:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:32.394 17:28:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:32.653 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:34:32.653 "name": "raid_bdev1", 00:34:32.653 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:32.653 "strip_size_kb": 0, 00:34:32.653 "state": "online", 00:34:32.653 "raid_level": "raid1", 00:34:32.653 "superblock": true, 00:34:32.653 "num_base_bdevs": 2, 00:34:32.653 "num_base_bdevs_discovered": 1, 00:34:32.653 "num_base_bdevs_operational": 1, 00:34:32.653 "base_bdevs_list": [ 00:34:32.653 { 00:34:32.653 "name": null, 00:34:32.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:32.653 "is_configured": false, 00:34:32.653 "data_offset": 256, 00:34:32.653 "data_size": 7936 00:34:32.653 }, 00:34:32.653 { 00:34:32.653 "name": "BaseBdev2", 00:34:32.653 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:32.653 "is_configured": true, 00:34:32.653 "data_offset": 256, 00:34:32.653 "data_size": 7936 00:34:32.653 } 00:34:32.653 ] 00:34:32.653 }' 
00:34:32.653 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:32.653 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:33.589 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:33.589 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:33.589 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:33.589 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:33.589 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:33.590 "name": "raid_bdev1", 00:34:33.590 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:33.590 "strip_size_kb": 0, 00:34:33.590 "state": "online", 00:34:33.590 "raid_level": "raid1", 00:34:33.590 "superblock": true, 00:34:33.590 "num_base_bdevs": 2, 00:34:33.590 "num_base_bdevs_discovered": 1, 00:34:33.590 "num_base_bdevs_operational": 1, 00:34:33.590 "base_bdevs_list": [ 00:34:33.590 { 00:34:33.590 "name": null, 00:34:33.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:33.590 "is_configured": false, 00:34:33.590 "data_offset": 256, 00:34:33.590 "data_size": 7936 00:34:33.590 }, 00:34:33.590 { 00:34:33.590 "name": "BaseBdev2", 00:34:33.590 
"uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:33.590 "is_configured": true, 00:34:33.590 "data_offset": 256, 00:34:33.590 "data_size": 7936 00:34:33.590 } 00:34:33.590 ] 00:34:33.590 }' 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:34:33.590 17:28:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:34:33.849 [2024-07-23 17:28:29.198503] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:34:33.849 [2024-07-23 17:28:29.198620] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:34:33.849 [2024-07-23 17:28:29.198635] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:34:33.849 request: 00:34:33.849 { 00:34:33.849 "base_bdev": "BaseBdev1", 00:34:33.849 "raid_bdev": "raid_bdev1", 00:34:33.849 "method": "bdev_raid_add_base_bdev", 00:34:33.849 "req_id": 1 00:34:33.849 } 00:34:33.849 Got JSON-RPC error response 00:34:33.849 response: 00:34:33.849 { 00:34:33.849 "code": -22, 00:34:33.849 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:34:33.849 } 00:34:33.849 17:28:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:34:33.849 17:28:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:34:33.849 17:28:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:34:33.849 17:28:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:34:33.849 17:28:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:35.263 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:34:35.263 "name": "raid_bdev1", 00:34:35.263 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:35.263 "strip_size_kb": 0, 00:34:35.263 "state": "online", 00:34:35.263 "raid_level": "raid1", 00:34:35.263 "superblock": true, 00:34:35.263 "num_base_bdevs": 2, 00:34:35.263 "num_base_bdevs_discovered": 1, 00:34:35.263 "num_base_bdevs_operational": 1, 00:34:35.263 "base_bdevs_list": [ 00:34:35.263 { 00:34:35.263 "name": null, 00:34:35.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:35.263 "is_configured": false, 00:34:35.263 "data_offset": 256, 00:34:35.263 "data_size": 7936 00:34:35.263 }, 00:34:35.263 { 00:34:35.263 "name": "BaseBdev2", 00:34:35.264 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:35.264 "is_configured": true, 00:34:35.264 "data_offset": 256, 00:34:35.264 "data_size": 7936 00:34:35.264 } 00:34:35.264 ] 00:34:35.264 }' 00:34:35.264 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:34:35.264 17:28:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:35.862 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:34:35.862 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:34:35.862 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:34:35.862 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:34:35.862 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:34:35.862 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:34:35.862 17:28:31 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:34:36.120 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:34:36.120 "name": "raid_bdev1", 00:34:36.120 "uuid": "6e2405c5-8de4-411c-9347-41772568f020", 00:34:36.120 "strip_size_kb": 0, 00:34:36.121 "state": "online", 00:34:36.121 "raid_level": "raid1", 00:34:36.121 "superblock": true, 00:34:36.121 "num_base_bdevs": 2, 00:34:36.121 "num_base_bdevs_discovered": 1, 00:34:36.121 "num_base_bdevs_operational": 1, 00:34:36.121 "base_bdevs_list": [ 00:34:36.121 { 00:34:36.121 "name": null, 00:34:36.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:34:36.121 "is_configured": false, 00:34:36.121 "data_offset": 256, 00:34:36.121 "data_size": 7936 00:34:36.121 }, 00:34:36.121 { 00:34:36.121 "name": "BaseBdev2", 00:34:36.121 "uuid": "d251f040-f96e-5543-a5bc-a406931a703e", 00:34:36.121 "is_configured": true, 00:34:36.121 "data_offset": 256, 00:34:36.121 "data_size": 7936 00:34:36.121 } 00:34:36.121 ] 00:34:36.121 }' 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 81631 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 81631 ']' 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 81631 00:34:36.121 17:28:31 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 81631 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 81631' 00:34:36.121 killing process with pid 81631 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 81631 00:34:36.121 Received shutdown signal, test time was about 60.000000 seconds 00:34:36.121 00:34:36.121 Latency(us) 00:34:36.121 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:36.121 =================================================================================================================== 00:34:36.121 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:34:36.121 [2024-07-23 17:28:31.480080] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:34:36.121 [2024-07-23 17:28:31.480167] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:34:36.121 [2024-07-23 17:28:31.480211] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:34:36.121 [2024-07-23 17:28:31.480223] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf26650 name raid_bdev1, state offline 00:34:36.121 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 81631 00:34:36.121 [2024-07-23 
17:28:31.512877] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:34:36.380 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:34:36.380 00:34:36.380 real 0m33.457s 00:34:36.380 user 0m54.547s 00:34:36.380 sys 0m4.558s 00:34:36.380 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:36.380 17:28:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:34:36.380 ************************************ 00:34:36.380 END TEST raid_rebuild_test_sb_md_interleaved 00:34:36.380 ************************************ 00:34:36.380 17:28:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:34:36.380 17:28:31 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:34:36.380 17:28:31 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:34:36.380 17:28:31 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 81631 ']' 00:34:36.380 17:28:31 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 81631 00:34:36.638 17:28:31 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:34:36.638 00:34:36.638 real 19m41.774s 00:34:36.638 user 33m33.247s 00:34:36.638 sys 3m35.575s 00:34:36.638 17:28:31 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:36.638 17:28:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:34:36.638 ************************************ 00:34:36.638 END TEST bdev_raid 00:34:36.638 ************************************ 00:34:36.638 17:28:31 -- common/autotest_common.sh@1142 -- # return 0 00:34:36.638 17:28:31 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:34:36.638 17:28:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:34:36.638 17:28:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:36.638 17:28:31 -- common/autotest_common.sh@10 -- # set +x 00:34:36.638 
************************************ 00:34:36.638 START TEST bdevperf_config 00:34:36.638 ************************************ 00:34:36.638 17:28:31 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:34:36.638 * Looking for test storage... 00:34:36.638 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:36.638 00:34:36.638 17:28:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:36.638 17:28:32 bdevperf_config -- 
bdevperf/test_config.sh@18 -- # create_job job0 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:36.639 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:36.639 17:28:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:36.897 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:36.897 17:28:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:36.898 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:36.898 
17:28:32 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:36.898 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:36.898 17:28:32 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:39.431 17:28:34 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-23 17:28:32.137166] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:39.431 [2024-07-23 17:28:32.137237] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86353 ] 00:34:39.431 Using job config with 4 jobs 00:34:39.431 [2024-07-23 17:28:32.292416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:39.431 [2024-07-23 17:28:32.359034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:39.431 cpumask for '\''job0'\'' is too big 00:34:39.431 cpumask for '\''job1'\'' is too big 00:34:39.431 cpumask for '\''job2'\'' is too big 00:34:39.431 cpumask for '\''job3'\'' is too big 00:34:39.431 Running I/O for 2 seconds... 
00:34:39.432 00:34:39.432 Latency(us) 00:34:39.432 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23709.62 23.15 0.00 0.00 10786.06 1866.35 16526.47 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23687.67 23.13 0.00 0.00 10773.04 1852.10 14588.88 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23665.92 23.11 0.00 0.00 10757.43 1852.10 12765.27 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23644.14 23.09 0.00 0.00 10742.96 1852.10 11055.64 00:34:39.432 =================================================================================================================== 00:34:39.432 Total : 94707.35 92.49 0.00 0.00 10764.87 1852.10 16526.47' 00:34:39.432 17:28:34 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-23 17:28:32.137166] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:39.432 [2024-07-23 17:28:32.137237] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86353 ] 00:34:39.432 Using job config with 4 jobs 00:34:39.432 [2024-07-23 17:28:32.292416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:39.432 [2024-07-23 17:28:32.359034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:39.432 cpumask for '\''job0'\'' is too big 00:34:39.432 cpumask for '\''job1'\'' is too big 00:34:39.432 cpumask for '\''job2'\'' is too big 00:34:39.432 cpumask for '\''job3'\'' is too big 00:34:39.432 Running I/O for 2 seconds... 
00:34:39.432 00:34:39.432 Latency(us) 00:34:39.432 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23709.62 23.15 0.00 0.00 10786.06 1866.35 16526.47 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23687.67 23.13 0.00 0.00 10773.04 1852.10 14588.88 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23665.92 23.11 0.00 0.00 10757.43 1852.10 12765.27 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23644.14 23.09 0.00 0.00 10742.96 1852.10 11055.64 00:34:39.432 =================================================================================================================== 00:34:39.432 Total : 94707.35 92.49 0.00 0.00 10764.87 1852.10 16526.47' 00:34:39.432 17:28:34 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-23 17:28:32.137166] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:39.432 [2024-07-23 17:28:32.137237] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86353 ] 00:34:39.432 Using job config with 4 jobs 00:34:39.432 [2024-07-23 17:28:32.292416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:39.432 [2024-07-23 17:28:32.359034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:39.432 cpumask for '\''job0'\'' is too big 00:34:39.432 cpumask for '\''job1'\'' is too big 00:34:39.432 cpumask for '\''job2'\'' is too big 00:34:39.432 cpumask for '\''job3'\'' is too big 00:34:39.432 Running I/O for 2 seconds... 
00:34:39.432 00:34:39.432 Latency(us) 00:34:39.432 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23709.62 23.15 0.00 0.00 10786.06 1866.35 16526.47 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23687.67 23.13 0.00 0.00 10773.04 1852.10 14588.88 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23665.92 23.11 0.00 0.00 10757.43 1852.10 12765.27 00:34:39.432 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:39.432 Malloc0 : 2.02 23644.14 23.09 0.00 0.00 10742.96 1852.10 11055.64 00:34:39.432 =================================================================================================================== 00:34:39.432 Total : 94707.35 92.49 0.00 0.00 10764.87 1852.10 16526.47' 00:34:39.432 17:28:34 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:34:39.432 17:28:34 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:34:39.432 17:28:34 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:34:39.432 17:28:34 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:39.432 [2024-07-23 17:28:34.807748] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:34:39.432 [2024-07-23 17:28:34.807815] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86708 ] 00:34:39.691 [2024-07-23 17:28:34.952304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:39.691 [2024-07-23 17:28:35.019495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:39.950 cpumask for 'job0' is too big 00:34:39.950 cpumask for 'job1' is too big 00:34:39.950 cpumask for 'job2' is too big 00:34:39.950 cpumask for 'job3' is too big 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:34:42.487 Running I/O for 2 seconds... 00:34:42.487 00:34:42.487 Latency(us) 00:34:42.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:42.487 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:42.487 Malloc0 : 2.02 23826.49 23.27 0.00 0.00 10735.83 1866.35 16412.49 00:34:42.487 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:42.487 Malloc0 : 2.02 23804.46 23.25 0.00 0.00 10721.72 1837.86 14588.88 00:34:42.487 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:42.487 Malloc0 : 2.02 23782.57 23.23 0.00 0.00 10707.39 1837.86 12765.27 00:34:42.487 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:34:42.487 Malloc0 : 2.03 23760.77 23.20 0.00 0.00 10692.67 1837.86 10998.65 00:34:42.487 =================================================================================================================== 00:34:42.487 Total : 95174.28 92.94 0.00 0.00 10714.40 1837.86 16412.49' 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:42.487 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:42.487 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:34:42.487 17:28:37 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:42.487 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:42.487 17:28:37 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:45.022 17:28:40 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-23 17:28:37.491633] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:45.022 [2024-07-23 17:28:37.491701] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87056 ] 00:34:45.022 Using job config with 3 jobs 00:34:45.022 [2024-07-23 17:28:37.634447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:45.022 [2024-07-23 17:28:37.709970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:45.023 cpumask for '\''job0'\'' is too big 00:34:45.023 cpumask for '\''job1'\'' is too big 00:34:45.023 cpumask for '\''job2'\'' is too big 00:34:45.023 Running I/O for 2 seconds... 
00:34:45.023 00:34:45.023 Latency(us) 00:34:45.023 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.01 32043.92 31.29 0.00 0.00 7978.46 1816.49 11625.52 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.02 32013.97 31.26 0.00 0.00 7967.42 1802.24 9858.89 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.02 31984.26 31.23 0.00 0.00 7957.12 1802.24 8206.25 00:34:45.023 =================================================================================================================== 00:34:45.023 Total : 96042.15 93.79 0.00 0.00 7967.67 1802.24 11625.52' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-23 17:28:37.491633] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:45.023 [2024-07-23 17:28:37.491701] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87056 ] 00:34:45.023 Using job config with 3 jobs 00:34:45.023 [2024-07-23 17:28:37.634447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:45.023 [2024-07-23 17:28:37.709970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:45.023 cpumask for '\''job0'\'' is too big 00:34:45.023 cpumask for '\''job1'\'' is too big 00:34:45.023 cpumask for '\''job2'\'' is too big 00:34:45.023 Running I/O for 2 seconds... 
00:34:45.023 00:34:45.023 Latency(us) 00:34:45.023 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.01 32043.92 31.29 0.00 0.00 7978.46 1816.49 11625.52 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.02 32013.97 31.26 0.00 0.00 7967.42 1802.24 9858.89 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.02 31984.26 31.23 0.00 0.00 7957.12 1802.24 8206.25 00:34:45.023 =================================================================================================================== 00:34:45.023 Total : 96042.15 93.79 0.00 0.00 7967.67 1802.24 11625.52' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-23 17:28:37.491633] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:45.023 [2024-07-23 17:28:37.491701] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87056 ] 00:34:45.023 Using job config with 3 jobs 00:34:45.023 [2024-07-23 17:28:37.634447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:45.023 [2024-07-23 17:28:37.709970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:45.023 cpumask for '\''job0'\'' is too big 00:34:45.023 cpumask for '\''job1'\'' is too big 00:34:45.023 cpumask for '\''job2'\'' is too big 00:34:45.023 Running I/O for 2 seconds... 
00:34:45.023 00:34:45.023 Latency(us) 00:34:45.023 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.01 32043.92 31.29 0.00 0.00 7978.46 1816.49 11625.52 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.02 32013.97 31.26 0.00 0.00 7967.42 1802.24 9858.89 00:34:45.023 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:34:45.023 Malloc0 : 2.02 31984.26 31.23 0.00 0.00 7957.12 1802.24 8206.25 00:34:45.023 =================================================================================================================== 00:34:45.023 Total : 96042.15 93.79 0.00 0.00 7967.67 1802.24 11625.52' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:34:45.023 17:28:40 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:45.023 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:45.023 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:45.023 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:45.023 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:34:45.023 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:34:45.023 17:28:40 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:47.557 17:28:42 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-23 17:28:40.223206] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:34:47.557 [2024-07-23 17:28:40.223275] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87418 ] 00:34:47.557 Using job config with 4 jobs 00:34:47.557 [2024-07-23 17:28:40.369205] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.557 [2024-07-23 17:28:40.437943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:47.557 cpumask for '\''job0'\'' is too big 00:34:47.557 cpumask for '\''job1'\'' is too big 00:34:47.557 cpumask for '\''job2'\'' is too big 00:34:47.557 cpumask for '\''job3'\'' is too big 00:34:47.557 Running I/O for 2 seconds... 00:34:47.557 00:34:47.557 Latency(us) 00:34:47.557 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:47.557 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc0 : 2.03 11856.54 11.58 0.00 0.00 21573.25 3818.18 33280.89 00:34:47.557 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc1 : 2.03 11845.38 11.57 0.00 0.00 21576.44 4673.00 33280.89 00:34:47.557 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc0 : 2.03 11834.49 11.56 0.00 0.00 21516.88 3789.69 29405.72 00:34:47.557 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc1 : 2.04 11823.46 11.55 0.00 0.00 21517.84 4644.51 29405.72 00:34:47.557 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc0 : 2.05 11874.70 11.60 0.00 0.00 21349.66 3761.20 25644.52 00:34:47.557 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc1 : 2.05 11863.56 11.59 0.00 0.00 21348.57 4673.00 25530.55 00:34:47.557 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc0 : 2.05 11852.73 11.57 0.00 0.00 21289.13 3789.69 21883.33 00:34:47.557 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.557 Malloc1 : 2.05 11841.74 11.56 0.00 0.00 21286.89 4644.51 21883.33 00:34:47.557 =================================================================================================================== 00:34:47.557 Total : 94792.60 92.57 0.00 0.00 21431.73 3761.20 33280.89' 00:34:47.557 17:28:42 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-23 17:28:40.223206] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:47.557 [2024-07-23 17:28:40.223275] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87418 ] 00:34:47.557 Using job config with 4 jobs 00:34:47.557 [2024-07-23 17:28:40.369205] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.557 [2024-07-23 17:28:40.437943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:47.557 cpumask for '\''job0'\'' is too big 00:34:47.557 cpumask for '\''job1'\'' is too big 00:34:47.557 cpumask for '\''job2'\'' is too big 00:34:47.558 cpumask for '\''job3'\'' is too big 00:34:47.558 Running I/O for 2 seconds... 
00:34:47.558 00:34:47.558 Latency(us) 00:34:47.558 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.03 11856.54 11.58 0.00 0.00 21573.25 3818.18 33280.89 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.03 11845.38 11.57 0.00 0.00 21576.44 4673.00 33280.89 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.03 11834.49 11.56 0.00 0.00 21516.88 3789.69 29405.72 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.04 11823.46 11.55 0.00 0.00 21517.84 4644.51 29405.72 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.05 11874.70 11.60 0.00 0.00 21349.66 3761.20 25644.52 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.05 11863.56 11.59 0.00 0.00 21348.57 4673.00 25530.55 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.05 11852.73 11.57 0.00 0.00 21289.13 3789.69 21883.33 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.05 11841.74 11.56 0.00 0.00 21286.89 4644.51 21883.33 00:34:47.558 =================================================================================================================== 00:34:47.558 Total : 94792.60 92.57 0.00 0.00 21431.73 3761.20 33280.89' 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-23 17:28:40.223206] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:34:47.558 [2024-07-23 17:28:40.223275] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87418 ] 00:34:47.558 Using job config with 4 jobs 00:34:47.558 [2024-07-23 17:28:40.369205] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.558 [2024-07-23 17:28:40.437943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:47.558 cpumask for '\''job0'\'' is too big 00:34:47.558 cpumask for '\''job1'\'' is too big 00:34:47.558 cpumask for '\''job2'\'' is too big 00:34:47.558 cpumask for '\''job3'\'' is too big 00:34:47.558 Running I/O for 2 seconds... 00:34:47.558 00:34:47.558 Latency(us) 00:34:47.558 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.03 11856.54 11.58 0.00 0.00 21573.25 3818.18 33280.89 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.03 11845.38 11.57 0.00 0.00 21576.44 4673.00 33280.89 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.03 11834.49 11.56 0.00 0.00 21516.88 3789.69 29405.72 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.04 11823.46 11.55 0.00 0.00 21517.84 4644.51 29405.72 00:34:47.558 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.05 11874.70 11.60 0.00 0.00 21349.66 3761.20 25644.52 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.05 11863.56 11.59 0.00 0.00 21348.57 4673.00 25530.55 00:34:47.558 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc0 : 2.05 11852.73 11.57 0.00 0.00 21289.13 3789.69 21883.33 00:34:47.558 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:34:47.558 Malloc1 : 2.05 11841.74 11.56 0.00 0.00 21286.89 4644.51 21883.33 00:34:47.558 =================================================================================================================== 00:34:47.558 Total : 94792.60 92.57 0.00 0.00 21431.73 3761.20 33280.89' 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:34:47.558 17:28:42 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:34:47.558 00:34:47.558 real 0m10.982s 00:34:47.558 user 0m9.621s 00:34:47.558 sys 0m1.208s 00:34:47.558 17:28:42 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:47.558 17:28:42 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:34:47.558 ************************************ 00:34:47.558 END TEST bdevperf_config 00:34:47.558 ************************************ 00:34:47.558 17:28:42 -- common/autotest_common.sh@1142 -- # return 0 00:34:47.558 17:28:42 -- spdk/autotest.sh@192 -- # uname -s 00:34:47.558 17:28:42 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:34:47.558 17:28:42 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:34:47.558 17:28:42 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:34:47.558 17:28:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:47.558 17:28:42 -- common/autotest_common.sh@10 -- # set +x 00:34:47.819 ************************************ 00:34:47.819 START TEST reactor_set_interrupt 00:34:47.819 ************************************ 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:34:47.819 * Looking for test storage... 00:34:47.819 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:47.819 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:34:47.819 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:34:47.819 17:28:43 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:34:47.819 17:28:43 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:34:47.819 17:28:43 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@47 -- # 
CONFIG_COVERAGE=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:34:47.820 
17:28:43 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:34:47.820 17:28:43 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:34:47.820 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:34:47.820 17:28:43 reactor_set_interrupt -- 
common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:34:47.820 17:28:43 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:34:47.820 #define SPDK_CONFIG_H 00:34:47.820 #define SPDK_CONFIG_APPS 1 00:34:47.820 #define SPDK_CONFIG_ARCH native 00:34:47.820 #undef SPDK_CONFIG_ASAN 00:34:47.820 #undef SPDK_CONFIG_AVAHI 00:34:47.820 #undef 
SPDK_CONFIG_CET 00:34:47.820 #define SPDK_CONFIG_COVERAGE 1 00:34:47.820 #define SPDK_CONFIG_CROSS_PREFIX 00:34:47.820 #define SPDK_CONFIG_CRYPTO 1 00:34:47.820 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:34:47.820 #undef SPDK_CONFIG_CUSTOMOCF 00:34:47.820 #undef SPDK_CONFIG_DAOS 00:34:47.820 #define SPDK_CONFIG_DAOS_DIR 00:34:47.820 #define SPDK_CONFIG_DEBUG 1 00:34:47.820 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:34:47.820 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:34:47.820 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:34:47.820 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:47.820 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:34:47.820 #undef SPDK_CONFIG_DPDK_UADK 00:34:47.820 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:34:47.820 #define SPDK_CONFIG_EXAMPLES 1 00:34:47.820 #undef SPDK_CONFIG_FC 00:34:47.820 #define SPDK_CONFIG_FC_PATH 00:34:47.820 #define SPDK_CONFIG_FIO_PLUGIN 1 00:34:47.820 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:34:47.820 #undef SPDK_CONFIG_FUSE 00:34:47.820 #undef SPDK_CONFIG_FUZZER 00:34:47.820 #define SPDK_CONFIG_FUZZER_LIB 00:34:47.820 #undef SPDK_CONFIG_GOLANG 00:34:47.820 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:34:47.820 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:34:47.820 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:34:47.820 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:34:47.820 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:34:47.820 #undef SPDK_CONFIG_HAVE_LIBBSD 00:34:47.820 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:34:47.820 #define SPDK_CONFIG_IDXD 1 00:34:47.820 #define SPDK_CONFIG_IDXD_KERNEL 1 00:34:47.820 #define SPDK_CONFIG_IPSEC_MB 1 00:34:47.820 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:34:47.820 #define SPDK_CONFIG_ISAL 1 00:34:47.820 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:34:47.820 #define 
SPDK_CONFIG_ISCSI_INITIATOR 1 00:34:47.820 #define SPDK_CONFIG_LIBDIR 00:34:47.820 #undef SPDK_CONFIG_LTO 00:34:47.820 #define SPDK_CONFIG_MAX_LCORES 128 00:34:47.820 #define SPDK_CONFIG_NVME_CUSE 1 00:34:47.820 #undef SPDK_CONFIG_OCF 00:34:47.820 #define SPDK_CONFIG_OCF_PATH 00:34:47.820 #define SPDK_CONFIG_OPENSSL_PATH 00:34:47.820 #undef SPDK_CONFIG_PGO_CAPTURE 00:34:47.820 #define SPDK_CONFIG_PGO_DIR 00:34:47.820 #undef SPDK_CONFIG_PGO_USE 00:34:47.820 #define SPDK_CONFIG_PREFIX /usr/local 00:34:47.820 #undef SPDK_CONFIG_RAID5F 00:34:47.820 #undef SPDK_CONFIG_RBD 00:34:47.820 #define SPDK_CONFIG_RDMA 1 00:34:47.820 #define SPDK_CONFIG_RDMA_PROV verbs 00:34:47.820 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:34:47.820 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:34:47.820 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:34:47.821 #define SPDK_CONFIG_SHARED 1 00:34:47.821 #undef SPDK_CONFIG_SMA 00:34:47.821 #define SPDK_CONFIG_TESTS 1 00:34:47.821 #undef SPDK_CONFIG_TSAN 00:34:47.821 #define SPDK_CONFIG_UBLK 1 00:34:47.821 #define SPDK_CONFIG_UBSAN 1 00:34:47.821 #undef SPDK_CONFIG_UNIT_TESTS 00:34:47.821 #undef SPDK_CONFIG_URING 00:34:47.821 #define SPDK_CONFIG_URING_PATH 00:34:47.821 #undef SPDK_CONFIG_URING_ZNS 00:34:47.821 #undef SPDK_CONFIG_USDT 00:34:47.821 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:34:47.821 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:34:47.821 #undef SPDK_CONFIG_VFIO_USER 00:34:47.821 #define SPDK_CONFIG_VFIO_USER_DIR 00:34:47.821 #define SPDK_CONFIG_VHOST 1 00:34:47.821 #define SPDK_CONFIG_VIRTIO 1 00:34:47.821 #undef SPDK_CONFIG_VTUNE 00:34:47.821 #define SPDK_CONFIG_VTUNE_DIR 00:34:47.821 #define SPDK_CONFIG_WERROR 1 00:34:47.821 #define SPDK_CONFIG_WPDK_DIR 00:34:47.821 #undef SPDK_CONFIG_XNVME 00:34:47.821 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:34:47.821 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:34:47.821 17:28:43 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:34:47.821 17:28:43 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:47.821 17:28:43 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:47.821 17:28:43 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:47.821 17:28:43 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:34:47.821 17:28:43 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:34:47.821 17:28:43 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:34:47.821 17:28:43 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 1 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:34:47.821 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : v22.11.4 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:34:47.822 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:34:47.822 
17:28:43 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 
00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:34:47.822 17:28:43 
reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:34:47.822 17:28:43 
reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:34:47.822 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:34:47.823 
17:28:43 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 87812 ]] 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 87812 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.mH280b 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" 
"$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.mH280b/tests/interrupt /tmp/spdk.mH280b 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:34:47.823 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:34:48.083 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:34:48.084 17:28:43 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=82180730880 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508527616 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=12327796736 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249551360 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254261760 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18890878976 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901708800 00:34:48.084 17:28:43 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=10829824 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253311488 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254265856 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=954368 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:34:48.084 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:34:48.084 * Looking for test storage... 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=82180730880 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:34:48.084 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=14542389248 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:48.084 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:34:48.084 17:28:43 reactor_set_interrupt -- 
common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:34:48.084 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:34:48.084 17:28:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:34:48.085 17:28:43 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=87853 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:34:48.085 17:28:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 87853 /var/tmp/spdk.sock 00:34:48.085 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 87853 ']' 00:34:48.085 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:48.085 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:48.085 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:48.085 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:34:48.085 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:48.085 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:34:48.085 [2024-07-23 17:28:43.316615] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:48.085 [2024-07-23 17:28:43.316685] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87853 ] 00:34:48.085 [2024-07-23 17:28:43.448755] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:48.085 [2024-07-23 17:28:43.500798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:48.085 [2024-07-23 17:28:43.500920] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:48.085 [2024-07-23 17:28:43.500921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:48.344 [2024-07-23 17:28:43.565720] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:34:48.344 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:48.344 17:28:43 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:34:48.344 17:28:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:34:48.344 17:28:43 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:48.603 Malloc0 00:34:48.603 Malloc1 00:34:48.603 Malloc2 00:34:48.603 17:28:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:34:48.603 17:28:43 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:34:48.603 17:28:43 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:34:48.603 17:28:43 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:34:48.603 5000+0 records in 00:34:48.603 5000+0 records out 00:34:48.603 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0238136 s, 430 MB/s 00:34:48.603 17:28:43 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:34:48.861 AIO0 00:34:48.861 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 87853 00:34:48.861 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 87853 without_thd 00:34:48.861 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=87853 00:34:48.861 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:34:48.861 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:34:48.861 
17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:34:48.862 17:28:44 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:34:48.862 17:28:44 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:34:48.862 17:28:44 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:34:48.862 17:28:44 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:48.862 17:28:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:48.862 17:28:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:34:49.121 17:28:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:49.379 17:28:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 
00:34:49.379 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:34:49.380 spdk_thread ids are 1 on reactor0. 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 87853 0 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 87853 0 idle 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 87853 -w 256 00:34:49.380 17:28:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87853 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.34 reactor_0' 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87853 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.34 reactor_0 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # 
awk '{print $9}' 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 87853 1 00:34:49.638 17:28:44 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 87853 1 idle 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:34:49.639 17:28:44 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 87853 -w 256 00:34:49.639 17:28:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87856 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_1' 00:34:49.639 17:28:45 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87856 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_1 00:34:49.639 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:49.639 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 87853 2 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 87853 2 idle 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 
87853 -w 256 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87857 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_2' 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87857 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_2 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:34:49.898 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:34:50.157 [2024-07-23 17:28:45.474193] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:34:50.157 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:34:50.415 [2024-07-23 17:28:45.741726] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:34:50.415 [2024-07-23 17:28:45.742139] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:50.415 17:28:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:34:50.674 [2024-07-23 17:28:45.997629] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:34:50.674 [2024-07-23 17:28:45.997796] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 87853 0 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 87853 0 busy 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:50.674 17:28:46 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 87853 -w 256 00:34:50.674 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87853 root 20 0 128.2g 34560 21888 R 99.9 0.0 0:00.80 reactor_0' 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87853 root 20 0 128.2g 34560 21888 R 99.9 0.0 0:00.80 reactor_0 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 87853 2 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 87853 2 busy 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:50.933 17:28:46 reactor_set_interrupt 
-- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 87853 -w 256 00:34:50.933 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87857 root 20 0 128.2g 34560 21888 R 93.8 0.0 0:00.36 reactor_2' 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87857 root 20 0 128.2g 34560 21888 R 93.8 0.0 0:00.36 reactor_2 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:51.191 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:34:51.450 [2024-07-23 17:28:46.613619] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:34:51.450 [2024-07-23 17:28:46.613750] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 87853 2 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 87853 2 idle 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 87853 -w 256 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87857 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.61 reactor_2' 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87857 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.61 reactor_2 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:51.450 17:28:46 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:51.450 17:28:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:34:52.017 [2024-07-23 17:28:47.309623] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:34:52.017 [2024-07-23 17:28:47.309921] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:52.017 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:34:52.017 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:34:52.017 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:34:52.275 [2024-07-23 17:28:47.570015] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 87853 0 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 87853 0 idle 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=87853 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 87853 -w 256 00:34:52.275 17:28:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 87853 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:01.91 reactor_0' 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 87853 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:01.91 reactor_0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 
00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:34:52.534 17:28:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 87853 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 87853 ']' 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 87853 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 87853 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 87853' 00:34:52.534 killing process with pid 87853 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 87853 00:34:52.534 17:28:47 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 87853 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # 
start_intr_tgt 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=88542 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:52.793 17:28:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:34:52.794 17:28:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 88542 /var/tmp/spdk.sock 00:34:52.794 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 88542 ']' 00:34:52.794 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:52.794 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:52.794 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:52.794 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:52.794 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:52.794 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:34:52.794 [2024-07-23 17:28:48.112097] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:34:52.794 [2024-07-23 17:28:48.112166] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid88542 ] 00:34:53.052 [2024-07-23 17:28:48.242294] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:53.052 [2024-07-23 17:28:48.293923] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:53.052 [2024-07-23 17:28:48.294003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:53.052 [2024-07-23 17:28:48.294005] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:53.052 [2024-07-23 17:28:48.358687] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:34:53.052 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:53.052 17:28:48 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:34:53.052 17:28:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:34:53.052 17:28:48 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:53.621 Malloc0 00:34:53.621 Malloc1 00:34:53.621 Malloc2 00:34:53.621 17:28:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:34:53.621 17:28:48 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:34:53.621 17:28:48 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:34:53.621 17:28:48 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:34:53.621 5000+0 records in 00:34:53.621 5000+0 records out 00:34:53.621 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0263086 s, 389 MB/s 
00:34:53.621 17:28:48 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:34:53.880 AIO0 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 88542 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 88542 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=88542 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:34:53.880 17:28:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:34:54.140 17:28:49 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:34:54.140 17:28:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:34:54.399 spdk_thread ids are 1 on reactor0. 
00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 88542 0 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 88542 0 idle 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:54.399 17:28:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88542 root 20 0 128.2g 34560 21888 R 0.0 0.0 0:00.33 reactor_0' 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88542 root 20 0 128.2g 34560 21888 R 0.0 0.0 0:00.33 reactor_0 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y 
]] 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 88542 1 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 88542 1 idle 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:54.658 17:28:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88587 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_1' 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88587 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_1 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk 
'{print $9}' 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 88542 2 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 88542 2 idle 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88589 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_2' 00:34:54.918 17:28:50 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88589 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.00 reactor_2 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:34:54.918 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:34:55.186 [2024-07-23 17:28:50.534741] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:34:55.186 [2024-07-23 17:28:50.534988] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:34:55.186 [2024-07-23 17:28:50.535157] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:55.186 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:34:55.501 [2024-07-23 17:28:50.791275] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:34:55.501 [2024-07-23 17:28:50.791492] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 88542 0 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 88542 0 busy 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:55.501 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88542 root 20 0 128.2g 34560 21888 R 99.9 0.0 0:00.79 reactor_0' 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88542 root 20 0 128.2g 34560 21888 R 99.9 0.0 0:00.79 reactor_0 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:34:55.760 17:28:50 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 88542 2 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 88542 2 busy 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:55.760 17:28:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88589 root 20 0 128.2g 34560 21888 R 99.9 0.0 0:00.36 reactor_2' 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88589 root 20 0 128.2g 34560 21888 R 99.9 0.0 0:00.36 reactor_2 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:55.760 17:28:51 reactor_set_interrupt 
-- interrupt/common.sh@25 -- # cpu_rate=99.9 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:55.760 17:28:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:34:56.019 [2024-07-23 17:28:51.409038] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:34:56.019 [2024-07-23 17:28:51.409164] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 88542 2 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 88542 2 idle 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:56.019 17:28:51 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:56.019 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88589 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.61 reactor_2' 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88589 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:00.61 reactor_2 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:56.278 17:28:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:34:56.537 [2024-07-23 17:28:51.842165] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:34:56.537 [2024-07-23 17:28:51.845967] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
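The trace above switches a reactor into interrupt mode by invoking `rpc.py` with the `interrupt_plugin` and the `reactor_set_interrupt_mode` method. A minimal sketch of how that invocation is assembled — the `rootdir` path, plugin flag, and method name are taken from the log; the `build_set_intr_cmd` helper itself is hypothetical and only builds the command string, it does not assume a running SPDK target:

```shell
#!/bin/sh
# Assemble the rpc.py call seen in the trace for a given reactor index.
# rootdir mirrors the workspace path in this log; helper name is an assumption.
rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk

build_set_intr_cmd() {
  idx=$1
  echo "$rootdir/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode $idx"
}
```

In the log this is run once per reactor (indices 2 and 0), and each call is followed by the NOTICE lines from `interrupt_tgt.c` confirming the switch completed.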
00:34:56.537 [2024-07-23 17:28:51.845992] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 88542 0 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 88542 0 idle 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=88542 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 88542 -w 256 00:34:56.537 17:28:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 88542 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:01.65 reactor_0' 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 88542 root 20 0 128.2g 34560 21888 S 0.0 0.0 0:01.65 reactor_0 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:34:56.797 17:28:52 reactor_set_interrupt -- 
interrupt/common.sh@26 -- # cpu_rate=0 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:34:56.797 17:28:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 88542 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 88542 ']' 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 88542 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 88542 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 88542' 00:34:56.797 killing process with pid 88542 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 88542 00:34:56.797 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 88542 00:34:57.056 17:28:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 
00:34:57.056 17:28:52 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:34:57.056 00:34:57.056 real 0m9.339s 00:34:57.056 user 0m9.883s 00:34:57.056 sys 0m2.166s 00:34:57.056 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:57.056 17:28:52 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:34:57.056 ************************************ 00:34:57.056 END TEST reactor_set_interrupt 00:34:57.056 ************************************ 00:34:57.056 17:28:52 -- common/autotest_common.sh@1142 -- # return 0 00:34:57.056 17:28:52 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:34:57.056 17:28:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:34:57.056 17:28:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:57.056 17:28:52 -- common/autotest_common.sh@10 -- # set +x 00:34:57.056 ************************************ 00:34:57.056 START TEST reap_unregistered_poller 00:34:57.056 ************************************ 00:34:57.056 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:34:57.317 * Looking for test storage... 
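The `reactor_set_interrupt` test that just finished repeatedly classifies a reactor thread as busy or idle by taking one batch snapshot of `top`, grepping for the `reactor_<idx>` thread, and comparing its %CPU column against thresholds. A sketch of that classification step — the 70/30 thresholds and the integer truncation (`99.9` → `99`) come from the `interrupt/common.sh` checks visible in this log, while the `classify_cpu_rate` function name is an assumption for illustration:

```shell
#!/bin/sh
# Classify a reactor thread from its %CPU reading, as interrupt/common.sh
# does: >= 70 means busy, <= 30 means idle (after truncating the decimal).
classify_cpu_rate() {
  cpu_rate=${1%%.*}          # e.g. "99.9" -> "99", "0.0" -> "0"
  if [ "$cpu_rate" -ge 70 ]; then
    echo busy
  elif [ "$cpu_rate" -le 30 ]; then
    echo idle
  else
    echo unknown
  fi
}

# In the real test the reading is produced by:
#   top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx" | awk '{print $9}'
```

The trace's retry counter (`j = 10` down to 0) wraps this check so a reactor that has not yet settled into the expected state gets re-sampled before the test fails.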
00:34:57.317 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:57.317 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:34:57.317 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:34:57.318 17:28:52 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:34:57.318 17:28:52 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build 
00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:34:57.318 17:28:52 
reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:34:57.318 
17:28:52 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:34:57.318 17:28:52 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:34:57.318 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:34:57.318 17:28:52 reap_unregistered_poller -- 
common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:34:57.318 17:28:52 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:34:57.318 #define SPDK_CONFIG_H 00:34:57.318 #define SPDK_CONFIG_APPS 1 00:34:57.318 #define SPDK_CONFIG_ARCH native 00:34:57.318 #undef SPDK_CONFIG_ASAN 00:34:57.318 #undef SPDK_CONFIG_AVAHI 00:34:57.318 #undef SPDK_CONFIG_CET 00:34:57.318 #define SPDK_CONFIG_COVERAGE 1 00:34:57.318 #define SPDK_CONFIG_CROSS_PREFIX 00:34:57.318 #define SPDK_CONFIG_CRYPTO 1 00:34:57.318 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:34:57.318 #undef SPDK_CONFIG_CUSTOMOCF 00:34:57.318 #undef SPDK_CONFIG_DAOS 00:34:57.318 #define SPDK_CONFIG_DAOS_DIR 00:34:57.319 #define SPDK_CONFIG_DEBUG 1 00:34:57.319 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:34:57.319 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:34:57.319 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/crypto-phy-autotest/dpdk/build/include 00:34:57.319 #define SPDK_CONFIG_DPDK_LIB_DIR 
/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:57.319 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:34:57.319 #undef SPDK_CONFIG_DPDK_UADK 00:34:57.319 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:34:57.319 #define SPDK_CONFIG_EXAMPLES 1 00:34:57.319 #undef SPDK_CONFIG_FC 00:34:57.319 #define SPDK_CONFIG_FC_PATH 00:34:57.319 #define SPDK_CONFIG_FIO_PLUGIN 1 00:34:57.319 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:34:57.319 #undef SPDK_CONFIG_FUSE 00:34:57.319 #undef SPDK_CONFIG_FUZZER 00:34:57.319 #define SPDK_CONFIG_FUZZER_LIB 00:34:57.319 #undef SPDK_CONFIG_GOLANG 00:34:57.319 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:34:57.319 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:34:57.319 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:34:57.319 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:34:57.319 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:34:57.319 #undef SPDK_CONFIG_HAVE_LIBBSD 00:34:57.319 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:34:57.319 #define SPDK_CONFIG_IDXD 1 00:34:57.319 #define SPDK_CONFIG_IDXD_KERNEL 1 00:34:57.319 #define SPDK_CONFIG_IPSEC_MB 1 00:34:57.319 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib 00:34:57.319 #define SPDK_CONFIG_ISAL 1 00:34:57.319 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:34:57.319 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:34:57.319 #define SPDK_CONFIG_LIBDIR 00:34:57.319 #undef SPDK_CONFIG_LTO 00:34:57.319 #define SPDK_CONFIG_MAX_LCORES 128 00:34:57.319 #define SPDK_CONFIG_NVME_CUSE 1 00:34:57.319 #undef SPDK_CONFIG_OCF 00:34:57.319 #define SPDK_CONFIG_OCF_PATH 00:34:57.319 #define SPDK_CONFIG_OPENSSL_PATH 00:34:57.319 #undef SPDK_CONFIG_PGO_CAPTURE 00:34:57.319 #define SPDK_CONFIG_PGO_DIR 00:34:57.319 #undef SPDK_CONFIG_PGO_USE 00:34:57.319 #define SPDK_CONFIG_PREFIX /usr/local 00:34:57.319 #undef SPDK_CONFIG_RAID5F 00:34:57.319 #undef SPDK_CONFIG_RBD 00:34:57.319 #define SPDK_CONFIG_RDMA 1 00:34:57.319 #define SPDK_CONFIG_RDMA_PROV verbs 
00:34:57.319 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:34:57.319 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:34:57.319 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:34:57.319 #define SPDK_CONFIG_SHARED 1 00:34:57.319 #undef SPDK_CONFIG_SMA 00:34:57.319 #define SPDK_CONFIG_TESTS 1 00:34:57.319 #undef SPDK_CONFIG_TSAN 00:34:57.319 #define SPDK_CONFIG_UBLK 1 00:34:57.319 #define SPDK_CONFIG_UBSAN 1 00:34:57.319 #undef SPDK_CONFIG_UNIT_TESTS 00:34:57.319 #undef SPDK_CONFIG_URING 00:34:57.319 #define SPDK_CONFIG_URING_PATH 00:34:57.319 #undef SPDK_CONFIG_URING_ZNS 00:34:57.319 #undef SPDK_CONFIG_USDT 00:34:57.319 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:34:57.319 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:34:57.319 #undef SPDK_CONFIG_VFIO_USER 00:34:57.319 #define SPDK_CONFIG_VFIO_USER_DIR 00:34:57.319 #define SPDK_CONFIG_VHOST 1 00:34:57.319 #define SPDK_CONFIG_VIRTIO 1 00:34:57.319 #undef SPDK_CONFIG_VTUNE 00:34:57.319 #define SPDK_CONFIG_VTUNE_DIR 00:34:57.319 #define SPDK_CONFIG_WERROR 1 00:34:57.319 #define SPDK_CONFIG_WPDK_DIR 00:34:57.319 #undef SPDK_CONFIG_XNVME 00:34:57.319 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:34:57.319 17:28:52 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:34:57.319 17:28:52 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:57.319 17:28:52 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:57.319 17:28:52 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:57.319 17:28:52 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:34:57.319 17:28:52 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:34:57.319 17:28:52 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:34:57.319 17:28:52 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 1 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:34:57.319 17:28:52 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:34:57.319 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:34:57.320 17:28:52 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:34:57.320 17:28:52 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : /var/jenkins/workspace/crypto-phy-autotest/dpdk/build 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@129 -- # 
export SPDK_TEST_CRYPTO 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : v22.11.4 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export 
SPDK_TEST_USE_IGB_UIO 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export 
SPDK_TEST_NVMF_MDNS 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/intel-ipsec-mb/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/isa-l/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:34:57.320 17:28:52 
reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:34:57.320 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:34:57.321 17:28:52 
reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export 
QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 
]] 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 89247 ]] 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 89247 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:34:57.321 17:28:52 reap_unregistered_poller -- 
common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.TKyw0Q 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.TKyw0Q/tests/interrupt /tmp/spdk.TKyw0Q 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:34:57.321 17:28:52 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=82180583424 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508527616 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=12327944192 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249551360 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254261760 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18890878976 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901708800 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=10829824 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253311488 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254265856 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=954368 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:57.321 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:34:57.322 * Looking for test storage... 
00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=82180583424 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=14542536704 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.322 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:34:57.322 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:34:57.322 17:28:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=89288 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:34:57.581 17:28:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 89288 /var/tmp/spdk.sock 00:34:57.581 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 89288 ']' 00:34:57.581 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:57.581 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:57.581 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:57.581 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:34:57.581 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:57.581 17:28:52 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:34:57.581 [2024-07-23 17:28:52.779464] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:34:57.581 [2024-07-23 17:28:52.779533] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89288 ] 00:34:57.581 [2024-07-23 17:28:52.909164] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:57.581 [2024-07-23 17:28:52.961302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:57.581 [2024-07-23 17:28:52.964935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:57.581 [2024-07-23 17:28:52.964939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:57.841 [2024-07-23 17:28:53.031241] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:34:57.841 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:57.841 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:34:57.841 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.841 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:34:57.841 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:34:57.841 "name": "app_thread", 00:34:57.841 "id": 1, 00:34:57.841 "active_pollers": [], 00:34:57.841 "timed_pollers": [ 00:34:57.841 { 00:34:57.841 "name": "rpc_subsystem_poll_servers", 00:34:57.841 "id": 1, 00:34:57.841 "state": "waiting", 00:34:57.841 "run_count": 0, 00:34:57.841 "busy_count": 0, 00:34:57.841 "period_ticks": 9200000 00:34:57.841 } 00:34:57.841 ], 00:34:57.841 "paused_pollers": [] 00:34:57.841 }' 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:34:57.841 
17:28:53 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:34:57.841 5000+0 records in 00:34:57.841 5000+0 records out 00:34:57.841 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0201776 s, 507 MB/s 00:34:57.841 17:28:53 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:34:58.100 AIO0 00:34:58.100 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:58.358 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:34:58.618 "name": "app_thread", 00:34:58.618 "id": 1, 00:34:58.618 "active_pollers": [], 00:34:58.618 "timed_pollers": [ 00:34:58.618 { 00:34:58.618 "name": "rpc_subsystem_poll_servers", 00:34:58.618 "id": 1, 00:34:58.618 "state": "waiting", 00:34:58.618 "run_count": 0, 00:34:58.618 "busy_count": 0, 
00:34:58.618 "period_ticks": 9200000 00:34:58.618 } 00:34:58.618 ], 00:34:58.618 "paused_pollers": [] 00:34:58.618 }' 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:34:58.618 17:28:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 89288 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 89288 ']' 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 89288 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:58.618 17:28:53 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 89288 00:34:58.618 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:58.618 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:58.618 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 
'killing process with pid 89288' 00:34:58.618 killing process with pid 89288 00:34:58.618 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 89288 00:34:58.618 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 89288 00:34:58.877 17:28:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:34:58.877 17:28:54 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:34:58.877 00:34:58.877 real 0m1.821s 00:34:58.877 user 0m1.296s 00:34:58.877 sys 0m0.661s 00:34:58.877 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:58.877 17:28:54 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:34:58.877 ************************************ 00:34:58.877 END TEST reap_unregistered_poller 00:34:58.877 ************************************ 00:34:59.136 17:28:54 -- common/autotest_common.sh@1142 -- # return 0 00:34:59.136 17:28:54 -- spdk/autotest.sh@198 -- # uname -s 00:34:59.136 17:28:54 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:34:59.136 17:28:54 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:34:59.136 17:28:54 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:34:59.136 17:28:54 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@260 -- # timing_exit lib 00:34:59.136 17:28:54 -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:59.136 17:28:54 -- common/autotest_common.sh@10 -- # set +x 00:34:59.136 17:28:54 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- 
spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:34:59.136 17:28:54 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:34:59.136 17:28:54 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:59.136 17:28:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:59.136 17:28:54 -- common/autotest_common.sh@10 -- # set +x 00:34:59.136 ************************************ 00:34:59.136 START TEST compress_compdev 00:34:59.136 ************************************ 00:34:59.136 17:28:54 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:34:59.136 * Looking for test storage... 
00:34:59.136 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:34:59.136 17:28:54 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:34:59.136 17:28:54 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:34:59.136 17:28:54 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:34:59.136 17:28:54 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:34:59.136 17:28:54 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:59.136 17:28:54 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:59.136 17:28:54 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:59.136 17:28:54 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:34:59.136 17:28:54 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:34:59.136 17:28:54 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:34:59.136 17:28:54 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:34:59.136 17:28:54 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:34:59.396 17:28:54 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:34:59.396 17:28:54 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:34:59.396 17:28:54 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:34:59.396 17:28:54 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=89562 00:34:59.396 17:28:54 compress_compdev -- 
compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:34:59.396 17:28:54 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:59.396 17:28:54 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 89562 00:34:59.396 17:28:54 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 89562 ']' 00:34:59.396 17:28:54 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:59.396 17:28:54 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:59.396 17:28:54 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:59.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:59.396 17:28:54 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:59.396 17:28:54 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:34:59.396 [2024-07-23 17:28:54.618992] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:34:59.396 [2024-07-23 17:28:54.619062] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89562 ] 00:34:59.396 [2024-07-23 17:28:54.757227] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:59.396 [2024-07-23 17:28:54.816285] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:59.396 [2024-07-23 17:28:54.816291] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:00.334 [2024-07-23 17:28:55.637201] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:35:00.334 17:28:55 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:00.334 17:28:55 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:35:00.334 17:28:55 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:35:00.334 17:28:55 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:35:00.334 17:28:55 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:35:00.902 [2024-07-23 17:28:56.303808] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xddd780 PMD being used: compress_qat 00:35:01.162 17:28:56 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:35:01.162 17:28:56 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:35:01.162 17:28:56 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:01.162 17:28:56 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:01.162 17:28:56 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:01.162 17:28:56 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:35:01.162 17:28:56 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:01.421 17:28:56 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:35:01.421 [ 00:35:01.421 { 00:35:01.421 "name": "Nvme0n1", 00:35:01.421 "aliases": [ 00:35:01.421 "01000000-0000-0000-5cd2-e43197705251" 00:35:01.421 ], 00:35:01.421 "product_name": "NVMe disk", 00:35:01.421 "block_size": 512, 00:35:01.421 "num_blocks": 15002931888, 00:35:01.421 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:35:01.421 "assigned_rate_limits": { 00:35:01.421 "rw_ios_per_sec": 0, 00:35:01.421 "rw_mbytes_per_sec": 0, 00:35:01.421 "r_mbytes_per_sec": 0, 00:35:01.421 "w_mbytes_per_sec": 0 00:35:01.421 }, 00:35:01.421 "claimed": false, 00:35:01.421 "zoned": false, 00:35:01.422 "supported_io_types": { 00:35:01.422 "read": true, 00:35:01.422 "write": true, 00:35:01.422 "unmap": true, 00:35:01.422 "flush": true, 00:35:01.422 "reset": true, 00:35:01.422 "nvme_admin": true, 00:35:01.422 "nvme_io": true, 00:35:01.422 "nvme_io_md": false, 00:35:01.422 "write_zeroes": true, 00:35:01.422 "zcopy": false, 00:35:01.422 "get_zone_info": false, 00:35:01.422 "zone_management": false, 00:35:01.422 "zone_append": false, 00:35:01.422 "compare": false, 00:35:01.422 "compare_and_write": false, 00:35:01.422 "abort": true, 00:35:01.422 "seek_hole": false, 00:35:01.422 "seek_data": false, 00:35:01.422 "copy": false, 00:35:01.422 "nvme_iov_md": false 00:35:01.422 }, 00:35:01.422 "driver_specific": { 00:35:01.422 "nvme": [ 00:35:01.422 { 00:35:01.422 "pci_address": "0000:5e:00.0", 00:35:01.422 "trid": { 00:35:01.422 "trtype": "PCIe", 00:35:01.422 "traddr": "0000:5e:00.0" 00:35:01.422 }, 00:35:01.422 "ctrlr_data": { 00:35:01.422 "cntlid": 0, 00:35:01.422 "vendor_id": "0x8086", 00:35:01.422 "model_number": "INTEL SSDPF2KX076TZO", 00:35:01.422 
"serial_number": "PHAC0301002G7P6CGN", 00:35:01.422 "firmware_revision": "JCV10200", 00:35:01.422 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:35:01.422 "oacs": { 00:35:01.422 "security": 1, 00:35:01.422 "format": 1, 00:35:01.422 "firmware": 1, 00:35:01.422 "ns_manage": 1 00:35:01.422 }, 00:35:01.422 "multi_ctrlr": false, 00:35:01.422 "ana_reporting": false 00:35:01.422 }, 00:35:01.422 "vs": { 00:35:01.422 "nvme_version": "1.3" 00:35:01.422 }, 00:35:01.422 "ns_data": { 00:35:01.422 "id": 1, 00:35:01.422 "can_share": false 00:35:01.422 }, 00:35:01.422 "security": { 00:35:01.422 "opal": true 00:35:01.422 } 00:35:01.422 } 00:35:01.422 ], 00:35:01.422 "mp_policy": "active_passive" 00:35:01.422 } 00:35:01.422 } 00:35:01.422 ] 00:35:01.422 17:28:56 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:01.422 17:28:56 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:35:01.681 [2024-07-23 17:28:56.986416] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xc2b850 PMD being used: compress_qat 00:35:04.217 8414436c-3e59-4784-925d-44eaf6c9d3de 00:35:04.217 17:28:59 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:35:04.217 3c57ebae-b46b-4e9b-b48c-f65f340c2028 00:35:04.217 17:28:59 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:35:04.217 17:28:59 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:35:04.217 17:28:59 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:04.218 17:28:59 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:04.218 17:28:59 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:04.218 17:28:59 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:35:04.218 17:28:59 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:04.476 17:28:59 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:35:04.735 [ 00:35:04.735 { 00:35:04.735 "name": "3c57ebae-b46b-4e9b-b48c-f65f340c2028", 00:35:04.735 "aliases": [ 00:35:04.735 "lvs0/lv0" 00:35:04.735 ], 00:35:04.735 "product_name": "Logical Volume", 00:35:04.735 "block_size": 512, 00:35:04.735 "num_blocks": 204800, 00:35:04.735 "uuid": "3c57ebae-b46b-4e9b-b48c-f65f340c2028", 00:35:04.735 "assigned_rate_limits": { 00:35:04.735 "rw_ios_per_sec": 0, 00:35:04.735 "rw_mbytes_per_sec": 0, 00:35:04.735 "r_mbytes_per_sec": 0, 00:35:04.735 "w_mbytes_per_sec": 0 00:35:04.735 }, 00:35:04.735 "claimed": false, 00:35:04.735 "zoned": false, 00:35:04.735 "supported_io_types": { 00:35:04.735 "read": true, 00:35:04.735 "write": true, 00:35:04.735 "unmap": true, 00:35:04.735 "flush": false, 00:35:04.735 "reset": true, 00:35:04.735 "nvme_admin": false, 00:35:04.735 "nvme_io": false, 00:35:04.735 "nvme_io_md": false, 00:35:04.735 "write_zeroes": true, 00:35:04.735 "zcopy": false, 00:35:04.735 "get_zone_info": false, 00:35:04.735 "zone_management": false, 00:35:04.735 "zone_append": false, 00:35:04.735 "compare": false, 00:35:04.735 "compare_and_write": false, 00:35:04.735 "abort": false, 00:35:04.735 "seek_hole": true, 00:35:04.735 "seek_data": true, 00:35:04.735 "copy": false, 00:35:04.735 "nvme_iov_md": false 00:35:04.735 }, 00:35:04.735 "driver_specific": { 00:35:04.735 "lvol": { 00:35:04.735 "lvol_store_uuid": "8414436c-3e59-4784-925d-44eaf6c9d3de", 00:35:04.735 "base_bdev": "Nvme0n1", 00:35:04.735 "thin_provision": true, 00:35:04.735 "num_allocated_clusters": 0, 00:35:04.735 "snapshot": false, 00:35:04.735 "clone": false, 00:35:04.735 "esnap_clone": false 00:35:04.735 } 00:35:04.735 } 
00:35:04.735 } 00:35:04.735 ] 00:35:04.735 17:28:59 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:04.735 17:28:59 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:35:04.735 17:28:59 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:35:04.994 [2024-07-23 17:29:00.186248] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:35:04.994 COMP_lvs0/lv0 00:35:04.994 17:29:00 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:35:04.994 17:29:00 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:35:04.994 17:29:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:04.994 17:29:00 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:04.994 17:29:00 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:04.994 17:29:00 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:04.994 17:29:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:05.253 17:29:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:35:05.253 [ 00:35:05.253 { 00:35:05.253 "name": "COMP_lvs0/lv0", 00:35:05.253 "aliases": [ 00:35:05.253 "a027431e-40fb-5b36-be7c-d82adf688e40" 00:35:05.253 ], 00:35:05.253 "product_name": "compress", 00:35:05.253 "block_size": 512, 00:35:05.253 "num_blocks": 200704, 00:35:05.253 "uuid": "a027431e-40fb-5b36-be7c-d82adf688e40", 00:35:05.253 "assigned_rate_limits": { 00:35:05.253 "rw_ios_per_sec": 0, 00:35:05.253 "rw_mbytes_per_sec": 0, 00:35:05.253 "r_mbytes_per_sec": 0, 00:35:05.253 "w_mbytes_per_sec": 0 00:35:05.253 
}, 00:35:05.253 "claimed": false, 00:35:05.253 "zoned": false, 00:35:05.253 "supported_io_types": { 00:35:05.253 "read": true, 00:35:05.253 "write": true, 00:35:05.253 "unmap": false, 00:35:05.253 "flush": false, 00:35:05.253 "reset": false, 00:35:05.253 "nvme_admin": false, 00:35:05.253 "nvme_io": false, 00:35:05.253 "nvme_io_md": false, 00:35:05.253 "write_zeroes": true, 00:35:05.253 "zcopy": false, 00:35:05.253 "get_zone_info": false, 00:35:05.253 "zone_management": false, 00:35:05.253 "zone_append": false, 00:35:05.253 "compare": false, 00:35:05.253 "compare_and_write": false, 00:35:05.253 "abort": false, 00:35:05.253 "seek_hole": false, 00:35:05.253 "seek_data": false, 00:35:05.253 "copy": false, 00:35:05.253 "nvme_iov_md": false 00:35:05.253 }, 00:35:05.253 "driver_specific": { 00:35:05.253 "compress": { 00:35:05.253 "name": "COMP_lvs0/lv0", 00:35:05.253 "base_bdev_name": "3c57ebae-b46b-4e9b-b48c-f65f340c2028", 00:35:05.253 "pm_path": "/tmp/pmem/1e7babf3-f5ce-4836-9020-e40702978b64" 00:35:05.253 } 00:35:05.253 } 00:35:05.253 } 00:35:05.253 ] 00:35:05.512 17:29:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:05.512 17:29:00 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:05.512 [2024-07-23 17:29:00.794293] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xddb000 PMD being used: compress_qat 00:35:05.512 [2024-07-23 17:29:00.798406] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f679019bc10 PMD being used: compress_qat 00:35:05.512 Running I/O for 3 seconds... 
00:35:08.798 00:35:08.798 Latency(us) 00:35:08.798 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:08.798 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:35:08.798 Verification LBA range: start 0x0 length 0x3100 00:35:08.798 COMP_lvs0/lv0 : 3.01 2796.20 10.92 0.00 0.00 11358.96 1061.40 9687.93 00:35:08.798 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:35:08.798 Verification LBA range: start 0x3100 length 0x3100 00:35:08.798 COMP_lvs0/lv0 : 3.01 2630.69 10.28 0.00 0.00 12030.23 1146.88 11226.60 00:35:08.798 =================================================================================================================== 00:35:08.798 Total : 5426.89 21.20 0.00 0.00 11684.43 1061.40 11226.60 00:35:08.798 0 00:35:08.798 17:29:03 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:35:08.798 17:29:03 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:35:08.798 17:29:04 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:35:09.056 17:29:04 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:35:09.056 17:29:04 compress_compdev -- compress/compress.sh@78 -- # killprocess 89562 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 89562 ']' 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 89562 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 89562 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 89562' 00:35:09.056 killing process with pid 89562 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@967 -- # kill 89562 00:35:09.056 Received shutdown signal, test time was about 3.000000 seconds 00:35:09.056 00:35:09.056 Latency(us) 00:35:09.056 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:09.056 =================================================================================================================== 00:35:09.056 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:09.056 17:29:04 compress_compdev -- common/autotest_common.sh@972 -- # wait 89562 00:35:12.341 17:29:07 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:35:12.341 17:29:07 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:35:12.341 17:29:07 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=91186 00:35:12.341 17:29:07 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:35:12.341 17:29:07 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 91186 00:35:12.341 17:29:07 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:35:12.341 17:29:07 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 91186 ']' 00:35:12.341 17:29:07 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:12.341 17:29:07 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:12.341 17:29:07 compress_compdev -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:12.341 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:12.342 17:29:07 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:12.342 17:29:07 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:35:12.342 [2024-07-23 17:29:07.504172] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:35:12.342 [2024-07-23 17:29:07.504245] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid91186 ] 00:35:12.342 [2024-07-23 17:29:07.644370] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:12.342 [2024-07-23 17:29:07.705921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:12.342 [2024-07-23 17:29:07.705926] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:13.279 [2024-07-23 17:29:08.532684] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:35:13.279 17:29:08 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:13.279 17:29:08 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:35:13.279 17:29:08 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:35:13.279 17:29:08 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:35:13.279 17:29:08 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:35:13.847 [2024-07-23 17:29:09.199513] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x202b780 PMD being used: compress_qat 00:35:13.847 17:29:09 compress_compdev -- compress/compress.sh@35 -- # 
waitforbdev Nvme0n1 00:35:13.847 17:29:09 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:35:13.847 17:29:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:13.847 17:29:09 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:13.847 17:29:09 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:13.847 17:29:09 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:13.847 17:29:09 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:14.106 17:29:09 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:35:14.366 [ 00:35:14.366 { 00:35:14.366 "name": "Nvme0n1", 00:35:14.366 "aliases": [ 00:35:14.366 "01000000-0000-0000-5cd2-e43197705251" 00:35:14.366 ], 00:35:14.366 "product_name": "NVMe disk", 00:35:14.366 "block_size": 512, 00:35:14.366 "num_blocks": 15002931888, 00:35:14.366 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:35:14.366 "assigned_rate_limits": { 00:35:14.366 "rw_ios_per_sec": 0, 00:35:14.366 "rw_mbytes_per_sec": 0, 00:35:14.366 "r_mbytes_per_sec": 0, 00:35:14.366 "w_mbytes_per_sec": 0 00:35:14.366 }, 00:35:14.366 "claimed": false, 00:35:14.366 "zoned": false, 00:35:14.366 "supported_io_types": { 00:35:14.366 "read": true, 00:35:14.366 "write": true, 00:35:14.366 "unmap": true, 00:35:14.366 "flush": true, 00:35:14.366 "reset": true, 00:35:14.366 "nvme_admin": true, 00:35:14.366 "nvme_io": true, 00:35:14.366 "nvme_io_md": false, 00:35:14.366 "write_zeroes": true, 00:35:14.366 "zcopy": false, 00:35:14.366 "get_zone_info": false, 00:35:14.366 "zone_management": false, 00:35:14.366 "zone_append": false, 00:35:14.366 "compare": false, 00:35:14.366 "compare_and_write": false, 00:35:14.366 "abort": true, 00:35:14.366 "seek_hole": false, 
00:35:14.366 "seek_data": false, 00:35:14.366 "copy": false, 00:35:14.366 "nvme_iov_md": false 00:35:14.366 }, 00:35:14.366 "driver_specific": { 00:35:14.366 "nvme": [ 00:35:14.366 { 00:35:14.366 "pci_address": "0000:5e:00.0", 00:35:14.366 "trid": { 00:35:14.366 "trtype": "PCIe", 00:35:14.366 "traddr": "0000:5e:00.0" 00:35:14.366 }, 00:35:14.366 "ctrlr_data": { 00:35:14.366 "cntlid": 0, 00:35:14.366 "vendor_id": "0x8086", 00:35:14.366 "model_number": "INTEL SSDPF2KX076TZO", 00:35:14.366 "serial_number": "PHAC0301002G7P6CGN", 00:35:14.366 "firmware_revision": "JCV10200", 00:35:14.366 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:35:14.366 "oacs": { 00:35:14.366 "security": 1, 00:35:14.366 "format": 1, 00:35:14.366 "firmware": 1, 00:35:14.366 "ns_manage": 1 00:35:14.366 }, 00:35:14.366 "multi_ctrlr": false, 00:35:14.366 "ana_reporting": false 00:35:14.366 }, 00:35:14.366 "vs": { 00:35:14.366 "nvme_version": "1.3" 00:35:14.366 }, 00:35:14.366 "ns_data": { 00:35:14.366 "id": 1, 00:35:14.366 "can_share": false 00:35:14.366 }, 00:35:14.366 "security": { 00:35:14.366 "opal": true 00:35:14.366 } 00:35:14.366 } 00:35:14.366 ], 00:35:14.366 "mp_policy": "active_passive" 00:35:14.366 } 00:35:14.366 } 00:35:14.366 ] 00:35:14.366 17:29:09 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:14.366 17:29:09 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:35:14.626 [2024-07-23 17:29:09.894205] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e79850 PMD being used: compress_qat 00:35:17.199 c7a4b036-35da-4bb9-beae-ed5376e425c7 00:35:17.199 17:29:12 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:35:17.199 f0551d8a-4019-41b2-8875-2627ce2ff6a6 00:35:17.199 17:29:12 compress_compdev -- compress/compress.sh@39 -- # 
waitforbdev lvs0/lv0 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:17.199 17:29:12 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:35:17.458 [ 00:35:17.458 { 00:35:17.458 "name": "f0551d8a-4019-41b2-8875-2627ce2ff6a6", 00:35:17.458 "aliases": [ 00:35:17.458 "lvs0/lv0" 00:35:17.458 ], 00:35:17.458 "product_name": "Logical Volume", 00:35:17.458 "block_size": 512, 00:35:17.458 "num_blocks": 204800, 00:35:17.458 "uuid": "f0551d8a-4019-41b2-8875-2627ce2ff6a6", 00:35:17.458 "assigned_rate_limits": { 00:35:17.458 "rw_ios_per_sec": 0, 00:35:17.458 "rw_mbytes_per_sec": 0, 00:35:17.458 "r_mbytes_per_sec": 0, 00:35:17.458 "w_mbytes_per_sec": 0 00:35:17.458 }, 00:35:17.458 "claimed": false, 00:35:17.458 "zoned": false, 00:35:17.458 "supported_io_types": { 00:35:17.458 "read": true, 00:35:17.458 "write": true, 00:35:17.458 "unmap": true, 00:35:17.458 "flush": false, 00:35:17.458 "reset": true, 00:35:17.458 "nvme_admin": false, 00:35:17.458 "nvme_io": false, 00:35:17.458 "nvme_io_md": false, 00:35:17.458 "write_zeroes": true, 00:35:17.458 "zcopy": false, 00:35:17.458 "get_zone_info": false, 00:35:17.458 "zone_management": false, 00:35:17.458 "zone_append": false, 00:35:17.458 "compare": false, 00:35:17.458 "compare_and_write": false, 00:35:17.458 "abort": false, 00:35:17.458 "seek_hole": true, 
00:35:17.458 "seek_data": true, 00:35:17.458 "copy": false, 00:35:17.458 "nvme_iov_md": false 00:35:17.458 }, 00:35:17.458 "driver_specific": { 00:35:17.458 "lvol": { 00:35:17.458 "lvol_store_uuid": "c7a4b036-35da-4bb9-beae-ed5376e425c7", 00:35:17.458 "base_bdev": "Nvme0n1", 00:35:17.458 "thin_provision": true, 00:35:17.458 "num_allocated_clusters": 0, 00:35:17.458 "snapshot": false, 00:35:17.458 "clone": false, 00:35:17.458 "esnap_clone": false 00:35:17.458 } 00:35:17.458 } 00:35:17.458 } 00:35:17.458 ] 00:35:17.458 17:29:12 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:17.458 17:29:12 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:35:17.458 17:29:12 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:35:17.717 [2024-07-23 17:29:13.021951] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:35:17.717 COMP_lvs0/lv0 00:35:17.717 17:29:13 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:35:17.718 17:29:13 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:35:17.718 17:29:13 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:17.718 17:29:13 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:17.718 17:29:13 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:17.718 17:29:13 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:17.718 17:29:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:17.976 17:29:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:35:18.234 [ 00:35:18.234 { 
00:35:18.234 "name": "COMP_lvs0/lv0", 00:35:18.234 "aliases": [ 00:35:18.234 "96cd5ea6-36c2-55fc-8c9f-572990577a75" 00:35:18.234 ], 00:35:18.234 "product_name": "compress", 00:35:18.234 "block_size": 512, 00:35:18.234 "num_blocks": 200704, 00:35:18.234 "uuid": "96cd5ea6-36c2-55fc-8c9f-572990577a75", 00:35:18.234 "assigned_rate_limits": { 00:35:18.234 "rw_ios_per_sec": 0, 00:35:18.234 "rw_mbytes_per_sec": 0, 00:35:18.234 "r_mbytes_per_sec": 0, 00:35:18.234 "w_mbytes_per_sec": 0 00:35:18.234 }, 00:35:18.234 "claimed": false, 00:35:18.234 "zoned": false, 00:35:18.234 "supported_io_types": { 00:35:18.234 "read": true, 00:35:18.234 "write": true, 00:35:18.234 "unmap": false, 00:35:18.234 "flush": false, 00:35:18.234 "reset": false, 00:35:18.234 "nvme_admin": false, 00:35:18.234 "nvme_io": false, 00:35:18.234 "nvme_io_md": false, 00:35:18.234 "write_zeroes": true, 00:35:18.234 "zcopy": false, 00:35:18.234 "get_zone_info": false, 00:35:18.234 "zone_management": false, 00:35:18.234 "zone_append": false, 00:35:18.234 "compare": false, 00:35:18.234 "compare_and_write": false, 00:35:18.234 "abort": false, 00:35:18.234 "seek_hole": false, 00:35:18.234 "seek_data": false, 00:35:18.234 "copy": false, 00:35:18.234 "nvme_iov_md": false 00:35:18.234 }, 00:35:18.234 "driver_specific": { 00:35:18.234 "compress": { 00:35:18.234 "name": "COMP_lvs0/lv0", 00:35:18.234 "base_bdev_name": "f0551d8a-4019-41b2-8875-2627ce2ff6a6", 00:35:18.234 "pm_path": "/tmp/pmem/eff927d7-ce2f-466f-a6b1-bcaf644aa582" 00:35:18.234 } 00:35:18.234 } 00:35:18.234 } 00:35:18.234 ] 00:35:18.234 17:29:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:18.234 17:29:13 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:18.492 [2024-07-23 17:29:13.680187] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fe4c41b15c0 PMD being used: compress_qat 00:35:18.492 [2024-07-23 17:29:13.683372] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2028980 PMD being used: compress_qat 00:35:18.492 Running I/O for 3 seconds... 00:35:21.782 00:35:21.782 Latency(us) 00:35:21.782 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:21.782 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:35:21.782 Verification LBA range: start 0x0 length 0x3100 00:35:21.782 COMP_lvs0/lv0 : 3.01 1680.61 6.56 0.00 0.00 18949.33 1937.59 18464.06 00:35:21.782 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:35:21.782 Verification LBA range: start 0x3100 length 0x3100 00:35:21.782 COMP_lvs0/lv0 : 3.01 1783.88 6.97 0.00 0.00 17824.61 1125.51 14816.83 00:35:21.782 =================================================================================================================== 00:35:21.782 Total : 3464.49 13.53 0.00 0.00 18370.20 1125.51 18464.06 00:35:21.782 0 00:35:21.782 17:29:16 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:35:21.782 17:29:16 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:35:21.782 17:29:16 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:35:22.042 17:29:17 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:35:22.042 17:29:17 compress_compdev -- compress/compress.sh@78 -- # killprocess 91186 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 91186 ']' 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 91186 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@954 -- 
# ps --no-headers -o comm= 91186 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 91186' 00:35:22.042 killing process with pid 91186 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@967 -- # kill 91186 00:35:22.042 Received shutdown signal, test time was about 3.000000 seconds 00:35:22.042 00:35:22.042 Latency(us) 00:35:22.042 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:22.042 =================================================================================================================== 00:35:22.042 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:22.042 17:29:17 compress_compdev -- common/autotest_common.sh@972 -- # wait 91186 00:35:25.335 17:29:20 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:35:25.335 17:29:20 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:35:25.335 17:29:20 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=92928 00:35:25.335 17:29:20 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:35:25.335 17:29:20 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:35:25.335 17:29:20 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 92928 00:35:25.335 17:29:20 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 92928 ']' 00:35:25.335 17:29:20 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:25.335 17:29:20 compress_compdev -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:35:25.335 17:29:20 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:25.335 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:25.335 17:29:20 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:25.335 17:29:20 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:35:25.335 [2024-07-23 17:29:20.417132] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:35:25.335 [2024-07-23 17:29:20.417205] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid92928 ] 00:35:25.335 [2024-07-23 17:29:20.554313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:25.335 [2024-07-23 17:29:20.612013] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:25.335 [2024-07-23 17:29:20.612018] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:26.276 [2024-07-23 17:29:21.426337] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:35:26.276 17:29:21 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:26.276 17:29:21 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:35:26.276 17:29:21 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:35:26.276 17:29:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:35:26.276 17:29:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:35:26.845 [2024-07-23 17:29:22.095415] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x2024780 PMD being used: compress_qat 00:35:26.846 17:29:22 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:35:26.846 17:29:22 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:35:26.846 17:29:22 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:26.846 17:29:22 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:26.846 17:29:22 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:26.846 17:29:22 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:26.846 17:29:22 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:27.105 17:29:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:35:27.365 [ 00:35:27.365 { 00:35:27.365 "name": "Nvme0n1", 00:35:27.365 "aliases": [ 00:35:27.365 "01000000-0000-0000-5cd2-e43197705251" 00:35:27.365 ], 00:35:27.365 "product_name": "NVMe disk", 00:35:27.365 "block_size": 512, 00:35:27.365 "num_blocks": 15002931888, 00:35:27.365 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:35:27.365 "assigned_rate_limits": { 00:35:27.365 "rw_ios_per_sec": 0, 00:35:27.365 "rw_mbytes_per_sec": 0, 00:35:27.365 "r_mbytes_per_sec": 0, 00:35:27.365 "w_mbytes_per_sec": 0 00:35:27.365 }, 00:35:27.365 "claimed": false, 00:35:27.365 "zoned": false, 00:35:27.365 "supported_io_types": { 00:35:27.365 "read": true, 00:35:27.365 "write": true, 00:35:27.365 "unmap": true, 00:35:27.365 "flush": true, 00:35:27.365 "reset": true, 00:35:27.365 "nvme_admin": true, 00:35:27.365 "nvme_io": true, 00:35:27.365 "nvme_io_md": false, 00:35:27.365 "write_zeroes": true, 00:35:27.365 "zcopy": false, 00:35:27.365 "get_zone_info": false, 00:35:27.365 "zone_management": false, 00:35:27.365 "zone_append": 
false, 00:35:27.365 "compare": false, 00:35:27.365 "compare_and_write": false, 00:35:27.365 "abort": true, 00:35:27.365 "seek_hole": false, 00:35:27.365 "seek_data": false, 00:35:27.365 "copy": false, 00:35:27.365 "nvme_iov_md": false 00:35:27.365 }, 00:35:27.365 "driver_specific": { 00:35:27.365 "nvme": [ 00:35:27.365 { 00:35:27.365 "pci_address": "0000:5e:00.0", 00:35:27.365 "trid": { 00:35:27.365 "trtype": "PCIe", 00:35:27.365 "traddr": "0000:5e:00.0" 00:35:27.365 }, 00:35:27.365 "ctrlr_data": { 00:35:27.365 "cntlid": 0, 00:35:27.365 "vendor_id": "0x8086", 00:35:27.365 "model_number": "INTEL SSDPF2KX076TZO", 00:35:27.365 "serial_number": "PHAC0301002G7P6CGN", 00:35:27.365 "firmware_revision": "JCV10200", 00:35:27.365 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:35:27.365 "oacs": { 00:35:27.365 "security": 1, 00:35:27.365 "format": 1, 00:35:27.365 "firmware": 1, 00:35:27.365 "ns_manage": 1 00:35:27.365 }, 00:35:27.365 "multi_ctrlr": false, 00:35:27.365 "ana_reporting": false 00:35:27.365 }, 00:35:27.365 "vs": { 00:35:27.365 "nvme_version": "1.3" 00:35:27.365 }, 00:35:27.365 "ns_data": { 00:35:27.365 "id": 1, 00:35:27.365 "can_share": false 00:35:27.365 }, 00:35:27.365 "security": { 00:35:27.365 "opal": true 00:35:27.365 } 00:35:27.365 } 00:35:27.365 ], 00:35:27.365 "mp_policy": "active_passive" 00:35:27.365 } 00:35:27.365 } 00:35:27.365 ] 00:35:27.365 17:29:22 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:27.365 17:29:22 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:35:27.625 [2024-07-23 17:29:22.805659] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e73760 PMD being used: compress_qat 00:35:30.160 21cd4901-8aa1-45c1-918e-8b4db6ac639a 00:35:30.160 17:29:25 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l 
lvs0 lv0 100 00:35:30.160 b40cba9c-780a-4401-a710-dcbaad6b5873 00:35:30.160 17:29:25 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:30.160 17:29:25 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:35:30.420 [ 00:35:30.420 { 00:35:30.420 "name": "b40cba9c-780a-4401-a710-dcbaad6b5873", 00:35:30.420 "aliases": [ 00:35:30.420 "lvs0/lv0" 00:35:30.420 ], 00:35:30.420 "product_name": "Logical Volume", 00:35:30.420 "block_size": 512, 00:35:30.420 "num_blocks": 204800, 00:35:30.420 "uuid": "b40cba9c-780a-4401-a710-dcbaad6b5873", 00:35:30.420 "assigned_rate_limits": { 00:35:30.420 "rw_ios_per_sec": 0, 00:35:30.420 "rw_mbytes_per_sec": 0, 00:35:30.420 "r_mbytes_per_sec": 0, 00:35:30.420 "w_mbytes_per_sec": 0 00:35:30.420 }, 00:35:30.420 "claimed": false, 00:35:30.420 "zoned": false, 00:35:30.420 "supported_io_types": { 00:35:30.420 "read": true, 00:35:30.420 "write": true, 00:35:30.420 "unmap": true, 00:35:30.420 "flush": false, 00:35:30.420 "reset": true, 00:35:30.420 "nvme_admin": false, 00:35:30.420 "nvme_io": false, 00:35:30.420 "nvme_io_md": false, 00:35:30.420 "write_zeroes": true, 00:35:30.420 "zcopy": false, 00:35:30.420 "get_zone_info": false, 00:35:30.420 "zone_management": false, 00:35:30.420 "zone_append": false, 
00:35:30.420 "compare": false, 00:35:30.420 "compare_and_write": false, 00:35:30.420 "abort": false, 00:35:30.420 "seek_hole": true, 00:35:30.420 "seek_data": true, 00:35:30.420 "copy": false, 00:35:30.420 "nvme_iov_md": false 00:35:30.420 }, 00:35:30.420 "driver_specific": { 00:35:30.420 "lvol": { 00:35:30.420 "lvol_store_uuid": "21cd4901-8aa1-45c1-918e-8b4db6ac639a", 00:35:30.420 "base_bdev": "Nvme0n1", 00:35:30.420 "thin_provision": true, 00:35:30.420 "num_allocated_clusters": 0, 00:35:30.420 "snapshot": false, 00:35:30.420 "clone": false, 00:35:30.420 "esnap_clone": false 00:35:30.420 } 00:35:30.420 } 00:35:30.420 } 00:35:30.420 ] 00:35:30.420 17:29:25 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:30.420 17:29:25 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:35:30.420 17:29:25 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:35:30.680 [2024-07-23 17:29:25.970793] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:35:30.680 COMP_lvs0/lv0 00:35:30.680 17:29:25 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:35:30.680 17:29:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:35:30.680 17:29:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:30.680 17:29:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:30.680 17:29:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:30.680 17:29:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:30.680 17:29:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:30.939 17:29:26 compress_compdev -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:35:31.198 [ 00:35:31.198 { 00:35:31.198 "name": "COMP_lvs0/lv0", 00:35:31.198 "aliases": [ 00:35:31.198 "01c55306-c1cc-5a8a-8cd8-a5bf13698378" 00:35:31.198 ], 00:35:31.198 "product_name": "compress", 00:35:31.198 "block_size": 4096, 00:35:31.198 "num_blocks": 25088, 00:35:31.198 "uuid": "01c55306-c1cc-5a8a-8cd8-a5bf13698378", 00:35:31.198 "assigned_rate_limits": { 00:35:31.198 "rw_ios_per_sec": 0, 00:35:31.198 "rw_mbytes_per_sec": 0, 00:35:31.198 "r_mbytes_per_sec": 0, 00:35:31.198 "w_mbytes_per_sec": 0 00:35:31.198 }, 00:35:31.198 "claimed": false, 00:35:31.198 "zoned": false, 00:35:31.198 "supported_io_types": { 00:35:31.198 "read": true, 00:35:31.198 "write": true, 00:35:31.198 "unmap": false, 00:35:31.198 "flush": false, 00:35:31.198 "reset": false, 00:35:31.198 "nvme_admin": false, 00:35:31.198 "nvme_io": false, 00:35:31.198 "nvme_io_md": false, 00:35:31.198 "write_zeroes": true, 00:35:31.198 "zcopy": false, 00:35:31.198 "get_zone_info": false, 00:35:31.198 "zone_management": false, 00:35:31.198 "zone_append": false, 00:35:31.198 "compare": false, 00:35:31.198 "compare_and_write": false, 00:35:31.198 "abort": false, 00:35:31.198 "seek_hole": false, 00:35:31.198 "seek_data": false, 00:35:31.198 "copy": false, 00:35:31.198 "nvme_iov_md": false 00:35:31.198 }, 00:35:31.198 "driver_specific": { 00:35:31.198 "compress": { 00:35:31.198 "name": "COMP_lvs0/lv0", 00:35:31.198 "base_bdev_name": "b40cba9c-780a-4401-a710-dcbaad6b5873", 00:35:31.198 "pm_path": "/tmp/pmem/82e7c34a-c836-4789-895f-3729d1eabac8" 00:35:31.198 } 00:35:31.198 } 00:35:31.198 } 00:35:31.198 ] 00:35:31.198 17:29:26 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:31.198 17:29:26 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:31.198 [2024-07-23 17:29:26.568794] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f58581b15c0 PMD being used: compress_qat 00:35:31.198 [2024-07-23 17:29:26.571970] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20217c0 PMD being used: compress_qat 00:35:31.198 Running I/O for 3 seconds... 00:35:34.485 00:35:34.485 Latency(us) 00:35:34.485 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:34.485 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:35:34.485 Verification LBA range: start 0x0 length 0x3100 00:35:34.485 COMP_lvs0/lv0 : 3.01 1697.91 6.63 0.00 0.00 18760.42 1837.86 16526.47 00:35:34.485 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:35:34.485 Verification LBA range: start 0x3100 length 0x3100 00:35:34.485 COMP_lvs0/lv0 : 3.01 1814.75 7.09 0.00 0.00 17521.74 1139.76 14816.83 00:35:34.485 =================================================================================================================== 00:35:34.485 Total : 3512.67 13.72 0.00 0.00 18120.46 1139.76 16526.47 00:35:34.485 0 00:35:34.485 17:29:29 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:35:34.485 17:29:29 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:35:34.485 17:29:29 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:35:34.744 17:29:30 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:35:34.744 17:29:30 compress_compdev -- compress/compress.sh@78 -- # killprocess 92928 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 92928 ']' 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 92928 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:35:34.744 17:29:30 
compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 92928 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 92928' 00:35:34.744 killing process with pid 92928 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@967 -- # kill 92928 00:35:34.744 Received shutdown signal, test time was about 3.000000 seconds 00:35:34.744 00:35:34.744 Latency(us) 00:35:34.744 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:34.744 =================================================================================================================== 00:35:34.744 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:34.744 17:29:30 compress_compdev -- common/autotest_common.sh@972 -- # wait 92928 00:35:38.048 17:29:33 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:35:38.048 17:29:33 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:35:38.048 17:29:33 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=94534 00:35:38.048 17:29:33 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:35:38.048 17:29:33 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:35:38.048 17:29:33 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 94534 00:35:38.048 17:29:33 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 94534 ']' 00:35:38.048 17:29:33 compress_compdev -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:38.048 17:29:33 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:38.048 17:29:33 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:38.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:38.048 17:29:33 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:38.048 17:29:33 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:35:38.048 [2024-07-23 17:29:33.270988] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:35:38.049 [2024-07-23 17:29:33.271060] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid94534 ] 00:35:38.049 [2024-07-23 17:29:33.394639] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:35:38.049 [2024-07-23 17:29:33.449921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:38.049 [2024-07-23 17:29:33.449959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:38.049 [2024-07-23 17:29:33.449961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:38.984 [2024-07-23 17:29:34.095326] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:35:38.985 17:29:34 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:38.985 17:29:34 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:35:38.985 17:29:34 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:35:38.985 17:29:34 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:35:38.985 
17:29:34 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:35:39.552 [2024-07-23 17:29:34.727292] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24b93c0 PMD being used: compress_qat 00:35:39.552 17:29:34 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:35:39.553 17:29:34 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:35:39.553 17:29:34 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:39.553 17:29:34 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:39.553 17:29:34 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:39.553 17:29:34 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:39.553 17:29:34 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:39.811 17:29:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:35:40.070 [ 00:35:40.070 { 00:35:40.070 "name": "Nvme0n1", 00:35:40.070 "aliases": [ 00:35:40.070 "01000000-0000-0000-5cd2-e43197705251" 00:35:40.070 ], 00:35:40.070 "product_name": "NVMe disk", 00:35:40.071 "block_size": 512, 00:35:40.071 "num_blocks": 15002931888, 00:35:40.071 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:35:40.071 "assigned_rate_limits": { 00:35:40.071 "rw_ios_per_sec": 0, 00:35:40.071 "rw_mbytes_per_sec": 0, 00:35:40.071 "r_mbytes_per_sec": 0, 00:35:40.071 "w_mbytes_per_sec": 0 00:35:40.071 }, 00:35:40.071 "claimed": false, 00:35:40.071 "zoned": false, 00:35:40.071 "supported_io_types": { 00:35:40.071 "read": true, 00:35:40.071 "write": true, 00:35:40.071 "unmap": true, 00:35:40.071 "flush": true, 00:35:40.071 "reset": true, 00:35:40.071 "nvme_admin": true, 00:35:40.071 "nvme_io": 
true, 00:35:40.071 "nvme_io_md": false, 00:35:40.071 "write_zeroes": true, 00:35:40.071 "zcopy": false, 00:35:40.071 "get_zone_info": false, 00:35:40.071 "zone_management": false, 00:35:40.071 "zone_append": false, 00:35:40.071 "compare": false, 00:35:40.071 "compare_and_write": false, 00:35:40.071 "abort": true, 00:35:40.071 "seek_hole": false, 00:35:40.071 "seek_data": false, 00:35:40.071 "copy": false, 00:35:40.071 "nvme_iov_md": false 00:35:40.071 }, 00:35:40.071 "driver_specific": { 00:35:40.071 "nvme": [ 00:35:40.071 { 00:35:40.071 "pci_address": "0000:5e:00.0", 00:35:40.071 "trid": { 00:35:40.071 "trtype": "PCIe", 00:35:40.071 "traddr": "0000:5e:00.0" 00:35:40.071 }, 00:35:40.071 "ctrlr_data": { 00:35:40.071 "cntlid": 0, 00:35:40.071 "vendor_id": "0x8086", 00:35:40.071 "model_number": "INTEL SSDPF2KX076TZO", 00:35:40.071 "serial_number": "PHAC0301002G7P6CGN", 00:35:40.071 "firmware_revision": "JCV10200", 00:35:40.071 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:35:40.071 "oacs": { 00:35:40.071 "security": 1, 00:35:40.071 "format": 1, 00:35:40.071 "firmware": 1, 00:35:40.071 "ns_manage": 1 00:35:40.071 }, 00:35:40.071 "multi_ctrlr": false, 00:35:40.071 "ana_reporting": false 00:35:40.071 }, 00:35:40.071 "vs": { 00:35:40.071 "nvme_version": "1.3" 00:35:40.071 }, 00:35:40.071 "ns_data": { 00:35:40.071 "id": 1, 00:35:40.071 "can_share": false 00:35:40.071 }, 00:35:40.071 "security": { 00:35:40.071 "opal": true 00:35:40.071 } 00:35:40.071 } 00:35:40.071 ], 00:35:40.071 "mp_policy": "active_passive" 00:35:40.071 } 00:35:40.071 } 00:35:40.071 ] 00:35:40.071 17:29:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:40.071 17:29:35 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:35:40.071 [2024-07-23 17:29:35.465063] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2307490 PMD being used: 
compress_qat 00:35:42.647 6a64b0ee-44a3-45da-8a4f-7f6090b5ec7f 00:35:42.647 17:29:37 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:35:42.647 4a917fdc-8b57-4513-92f4-dd4672080162 00:35:42.647 17:29:37 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:35:42.647 17:29:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:35:42.647 17:29:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:42.647 17:29:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:42.647 17:29:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:42.647 17:29:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:42.647 17:29:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:42.907 17:29:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:35:42.907 [ 00:35:42.907 { 00:35:42.907 "name": "4a917fdc-8b57-4513-92f4-dd4672080162", 00:35:42.907 "aliases": [ 00:35:42.907 "lvs0/lv0" 00:35:42.907 ], 00:35:42.907 "product_name": "Logical Volume", 00:35:42.907 "block_size": 512, 00:35:42.907 "num_blocks": 204800, 00:35:42.907 "uuid": "4a917fdc-8b57-4513-92f4-dd4672080162", 00:35:42.907 "assigned_rate_limits": { 00:35:42.907 "rw_ios_per_sec": 0, 00:35:42.907 "rw_mbytes_per_sec": 0, 00:35:42.907 "r_mbytes_per_sec": 0, 00:35:42.907 "w_mbytes_per_sec": 0 00:35:42.907 }, 00:35:42.907 "claimed": false, 00:35:42.907 "zoned": false, 00:35:42.907 "supported_io_types": { 00:35:42.907 "read": true, 00:35:42.907 "write": true, 00:35:42.907 "unmap": true, 00:35:42.907 "flush": false, 00:35:42.907 "reset": true, 00:35:42.907 "nvme_admin": false, 00:35:42.907 
"nvme_io": false, 00:35:42.907 "nvme_io_md": false, 00:35:42.907 "write_zeroes": true, 00:35:42.907 "zcopy": false, 00:35:42.907 "get_zone_info": false, 00:35:42.907 "zone_management": false, 00:35:42.907 "zone_append": false, 00:35:42.907 "compare": false, 00:35:42.907 "compare_and_write": false, 00:35:42.907 "abort": false, 00:35:42.907 "seek_hole": true, 00:35:42.907 "seek_data": true, 00:35:42.907 "copy": false, 00:35:42.907 "nvme_iov_md": false 00:35:42.907 }, 00:35:42.907 "driver_specific": { 00:35:42.907 "lvol": { 00:35:42.907 "lvol_store_uuid": "6a64b0ee-44a3-45da-8a4f-7f6090b5ec7f", 00:35:42.907 "base_bdev": "Nvme0n1", 00:35:42.907 "thin_provision": true, 00:35:42.907 "num_allocated_clusters": 0, 00:35:42.907 "snapshot": false, 00:35:42.907 "clone": false, 00:35:42.907 "esnap_clone": false 00:35:42.907 } 00:35:42.907 } 00:35:42.907 } 00:35:42.907 ] 00:35:43.166 17:29:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:43.166 17:29:38 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:35:43.166 17:29:38 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:35:43.166 [2024-07-23 17:29:38.576851] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:35:43.166 COMP_lvs0/lv0 00:35:43.425 17:29:38 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:35:43.425 17:29:38 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:35:43.425 17:29:38 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:43.425 17:29:38 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:43.425 17:29:38 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:43.425 17:29:38 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:43.425 17:29:38 
compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:43.684 17:29:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:35:43.684 [ 00:35:43.684 { 00:35:43.684 "name": "COMP_lvs0/lv0", 00:35:43.684 "aliases": [ 00:35:43.684 "5ebfbbcf-97f5-5d47-b612-0a48a03967d2" 00:35:43.684 ], 00:35:43.684 "product_name": "compress", 00:35:43.684 "block_size": 512, 00:35:43.684 "num_blocks": 200704, 00:35:43.684 "uuid": "5ebfbbcf-97f5-5d47-b612-0a48a03967d2", 00:35:43.684 "assigned_rate_limits": { 00:35:43.684 "rw_ios_per_sec": 0, 00:35:43.684 "rw_mbytes_per_sec": 0, 00:35:43.684 "r_mbytes_per_sec": 0, 00:35:43.684 "w_mbytes_per_sec": 0 00:35:43.684 }, 00:35:43.684 "claimed": false, 00:35:43.684 "zoned": false, 00:35:43.684 "supported_io_types": { 00:35:43.684 "read": true, 00:35:43.684 "write": true, 00:35:43.684 "unmap": false, 00:35:43.684 "flush": false, 00:35:43.684 "reset": false, 00:35:43.684 "nvme_admin": false, 00:35:43.684 "nvme_io": false, 00:35:43.684 "nvme_io_md": false, 00:35:43.684 "write_zeroes": true, 00:35:43.684 "zcopy": false, 00:35:43.684 "get_zone_info": false, 00:35:43.684 "zone_management": false, 00:35:43.684 "zone_append": false, 00:35:43.684 "compare": false, 00:35:43.684 "compare_and_write": false, 00:35:43.684 "abort": false, 00:35:43.684 "seek_hole": false, 00:35:43.684 "seek_data": false, 00:35:43.684 "copy": false, 00:35:43.684 "nvme_iov_md": false 00:35:43.684 }, 00:35:43.684 "driver_specific": { 00:35:43.684 "compress": { 00:35:43.684 "name": "COMP_lvs0/lv0", 00:35:43.684 "base_bdev_name": "4a917fdc-8b57-4513-92f4-dd4672080162", 00:35:43.684 "pm_path": "/tmp/pmem/c7efddec-9404-4d09-9bf0-695517f0902c" 00:35:43.684 } 00:35:43.684 } 00:35:43.685 } 00:35:43.685 ] 00:35:43.944 17:29:39 compress_compdev -- common/autotest_common.sh@905 -- # return 0 
00:35:43.944 17:29:39 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:35:43.944 [2024-07-23 17:29:39.237846] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fb2f41b1350 PMD being used: compress_qat 00:35:43.944 I/O targets: 00:35:43.944 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:35:43.944 00:35:43.944 00:35:43.944 CUnit - A unit testing framework for C - Version 2.1-3 00:35:43.944 http://cunit.sourceforge.net/ 00:35:43.944 00:35:43.944 00:35:43.944 Suite: bdevio tests on: COMP_lvs0/lv0 00:35:43.944 Test: blockdev write read block ...passed 00:35:43.944 Test: blockdev write zeroes read block ...passed 00:35:43.944 Test: blockdev write zeroes read no split ...passed 00:35:43.944 Test: blockdev write zeroes read split ...passed 00:35:43.944 Test: blockdev write zeroes read split partial ...passed 00:35:43.944 Test: blockdev reset ...[2024-07-23 17:29:39.341765] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:35:43.944 passed 00:35:43.944 Test: blockdev write read 8 blocks ...passed 00:35:43.944 Test: blockdev write read size > 128k ...passed 00:35:43.944 Test: blockdev write read invalid size ...passed 00:35:43.944 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:43.944 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:43.944 Test: blockdev write read max offset ...passed 00:35:43.944 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:43.944 Test: blockdev writev readv 8 blocks ...passed 00:35:43.944 Test: blockdev writev readv 30 x 1block ...passed 00:35:43.944 Test: blockdev writev readv block ...passed 00:35:43.944 Test: blockdev writev readv size > 128k ...passed 00:35:43.944 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:43.944 Test: blockdev comparev and writev ...passed 00:35:43.944 Test: blockdev nvme 
passthru rw ...passed 00:35:43.944 Test: blockdev nvme passthru vendor specific ...passed 00:35:43.944 Test: blockdev nvme admin passthru ...passed 00:35:43.944 Test: blockdev copy ...passed 00:35:43.944 00:35:43.944 Run Summary: Type Total Ran Passed Failed Inactive 00:35:43.944 suites 1 1 n/a 0 0 00:35:43.944 tests 23 23 23 0 0 00:35:43.944 asserts 130 130 130 0 n/a 00:35:43.944 00:35:43.944 Elapsed time = 0.236 seconds 00:35:43.944 0 00:35:44.203 17:29:39 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:35:44.203 17:29:39 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:35:44.462 17:29:39 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:35:44.720 17:29:39 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:35:44.720 17:29:39 compress_compdev -- compress/compress.sh@62 -- # killprocess 94534 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 94534 ']' 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 94534 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 94534 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 94534' 00:35:44.720 killing process with pid 94534 00:35:44.720 17:29:39 compress_compdev -- common/autotest_common.sh@967 -- # kill 94534 00:35:44.720 17:29:39 
compress_compdev -- common/autotest_common.sh@972 -- # wait 94534 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@91 -- # '[' 1 -eq 1 ']' 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@92 -- # run_bdevperf 64 16384 30 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=95782 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 64 -o 16384 -w verify -t 30 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:35:48.008 17:29:42 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 95782 00:35:48.008 17:29:42 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 95782 ']' 00:35:48.008 17:29:42 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:48.008 17:29:42 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:48.008 17:29:42 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:48.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:48.009 17:29:42 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:48.009 17:29:42 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:35:48.009 [2024-07-23 17:29:42.775931] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:35:48.009 [2024-07-23 17:29:42.776004] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95782 ] 00:35:48.009 [2024-07-23 17:29:42.912446] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:48.009 [2024-07-23 17:29:42.969779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:48.009 [2024-07-23 17:29:42.969784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:48.575 [2024-07-23 17:29:43.780719] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:35:48.575 17:29:43 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:48.575 17:29:43 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:35:48.575 17:29:43 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:35:48.575 17:29:43 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:35:48.575 17:29:43 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:35:49.143 [2024-07-23 17:29:44.391723] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c25780 PMD being used: compress_qat 00:35:49.143 17:29:44 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:35:49.143 17:29:44 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:35:49.143 17:29:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:49.143 17:29:44 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:49.143 17:29:44 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:49.143 17:29:44 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:35:49.143 17:29:44 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:49.401 17:29:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:35:49.660 [ 00:35:49.660 { 00:35:49.660 "name": "Nvme0n1", 00:35:49.660 "aliases": [ 00:35:49.660 "01000000-0000-0000-5cd2-e43197705251" 00:35:49.660 ], 00:35:49.660 "product_name": "NVMe disk", 00:35:49.660 "block_size": 512, 00:35:49.660 "num_blocks": 15002931888, 00:35:49.660 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:35:49.660 "assigned_rate_limits": { 00:35:49.660 "rw_ios_per_sec": 0, 00:35:49.660 "rw_mbytes_per_sec": 0, 00:35:49.660 "r_mbytes_per_sec": 0, 00:35:49.660 "w_mbytes_per_sec": 0 00:35:49.660 }, 00:35:49.660 "claimed": false, 00:35:49.660 "zoned": false, 00:35:49.660 "supported_io_types": { 00:35:49.660 "read": true, 00:35:49.660 "write": true, 00:35:49.660 "unmap": true, 00:35:49.660 "flush": true, 00:35:49.660 "reset": true, 00:35:49.660 "nvme_admin": true, 00:35:49.660 "nvme_io": true, 00:35:49.660 "nvme_io_md": false, 00:35:49.660 "write_zeroes": true, 00:35:49.660 "zcopy": false, 00:35:49.660 "get_zone_info": false, 00:35:49.660 "zone_management": false, 00:35:49.660 "zone_append": false, 00:35:49.660 "compare": false, 00:35:49.660 "compare_and_write": false, 00:35:49.660 "abort": true, 00:35:49.660 "seek_hole": false, 00:35:49.660 "seek_data": false, 00:35:49.660 "copy": false, 00:35:49.660 "nvme_iov_md": false 00:35:49.660 }, 00:35:49.660 "driver_specific": { 00:35:49.660 "nvme": [ 00:35:49.660 { 00:35:49.660 "pci_address": "0000:5e:00.0", 00:35:49.660 "trid": { 00:35:49.660 "trtype": "PCIe", 00:35:49.660 "traddr": "0000:5e:00.0" 00:35:49.660 }, 00:35:49.660 "ctrlr_data": { 00:35:49.660 "cntlid": 0, 00:35:49.660 "vendor_id": "0x8086", 00:35:49.660 "model_number": "INTEL SSDPF2KX076TZO", 00:35:49.660 
"serial_number": "PHAC0301002G7P6CGN", 00:35:49.660 "firmware_revision": "JCV10200", 00:35:49.660 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:35:49.660 "oacs": { 00:35:49.660 "security": 1, 00:35:49.660 "format": 1, 00:35:49.660 "firmware": 1, 00:35:49.660 "ns_manage": 1 00:35:49.660 }, 00:35:49.660 "multi_ctrlr": false, 00:35:49.660 "ana_reporting": false 00:35:49.660 }, 00:35:49.660 "vs": { 00:35:49.660 "nvme_version": "1.3" 00:35:49.660 }, 00:35:49.660 "ns_data": { 00:35:49.660 "id": 1, 00:35:49.660 "can_share": false 00:35:49.660 }, 00:35:49.660 "security": { 00:35:49.660 "opal": true 00:35:49.660 } 00:35:49.660 } 00:35:49.660 ], 00:35:49.660 "mp_policy": "active_passive" 00:35:49.660 } 00:35:49.660 } 00:35:49.660 ] 00:35:49.660 17:29:44 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:49.660 17:29:44 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:35:49.920 [2024-07-23 17:29:45.186765] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a73850 PMD being used: compress_qat 00:35:52.455 402ad479-84c1-4135-878e-1d69e375525a 00:35:52.455 17:29:47 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:35:52.455 aee878e1-445b-460c-baba-10a87e411eb9 00:35:52.455 17:29:47 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:35:52.455 17:29:47 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:35:52.455 17:29:47 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:52.455 17:29:47 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:52.455 17:29:47 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:52.455 17:29:47 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:35:52.455 17:29:47 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:52.715 17:29:47 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:35:52.715 [ 00:35:52.715 { 00:35:52.715 "name": "aee878e1-445b-460c-baba-10a87e411eb9", 00:35:52.715 "aliases": [ 00:35:52.715 "lvs0/lv0" 00:35:52.715 ], 00:35:52.715 "product_name": "Logical Volume", 00:35:52.715 "block_size": 512, 00:35:52.715 "num_blocks": 204800, 00:35:52.715 "uuid": "aee878e1-445b-460c-baba-10a87e411eb9", 00:35:52.715 "assigned_rate_limits": { 00:35:52.715 "rw_ios_per_sec": 0, 00:35:52.715 "rw_mbytes_per_sec": 0, 00:35:52.715 "r_mbytes_per_sec": 0, 00:35:52.715 "w_mbytes_per_sec": 0 00:35:52.715 }, 00:35:52.715 "claimed": false, 00:35:52.715 "zoned": false, 00:35:52.715 "supported_io_types": { 00:35:52.715 "read": true, 00:35:52.715 "write": true, 00:35:52.715 "unmap": true, 00:35:52.715 "flush": false, 00:35:52.715 "reset": true, 00:35:52.715 "nvme_admin": false, 00:35:52.715 "nvme_io": false, 00:35:52.715 "nvme_io_md": false, 00:35:52.715 "write_zeroes": true, 00:35:52.715 "zcopy": false, 00:35:52.715 "get_zone_info": false, 00:35:52.715 "zone_management": false, 00:35:52.715 "zone_append": false, 00:35:52.715 "compare": false, 00:35:52.715 "compare_and_write": false, 00:35:52.715 "abort": false, 00:35:52.715 "seek_hole": true, 00:35:52.715 "seek_data": true, 00:35:52.715 "copy": false, 00:35:52.715 "nvme_iov_md": false 00:35:52.715 }, 00:35:52.715 "driver_specific": { 00:35:52.715 "lvol": { 00:35:52.715 "lvol_store_uuid": "402ad479-84c1-4135-878e-1d69e375525a", 00:35:52.715 "base_bdev": "Nvme0n1", 00:35:52.715 "thin_provision": true, 00:35:52.715 "num_allocated_clusters": 0, 00:35:52.715 "snapshot": false, 00:35:52.715 "clone": false, 00:35:52.715 "esnap_clone": false 00:35:52.715 } 00:35:52.715 } 
00:35:52.715 } 00:35:52.715 ] 00:35:52.715 17:29:48 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:52.715 17:29:48 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:35:52.715 17:29:48 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:35:52.974 [2024-07-23 17:29:48.359714] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:35:52.974 COMP_lvs0/lv0 00:35:52.974 17:29:48 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:35:52.975 17:29:48 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:35:52.975 17:29:48 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:52.975 17:29:48 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:35:52.975 17:29:48 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:52.975 17:29:48 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:52.975 17:29:48 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:35:53.234 17:29:48 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:35:53.493 [ 00:35:53.493 { 00:35:53.493 "name": "COMP_lvs0/lv0", 00:35:53.493 "aliases": [ 00:35:53.493 "1adb9122-d5d4-5027-8911-7146bf7e7b83" 00:35:53.493 ], 00:35:53.493 "product_name": "compress", 00:35:53.493 "block_size": 512, 00:35:53.493 "num_blocks": 200704, 00:35:53.493 "uuid": "1adb9122-d5d4-5027-8911-7146bf7e7b83", 00:35:53.493 "assigned_rate_limits": { 00:35:53.493 "rw_ios_per_sec": 0, 00:35:53.493 "rw_mbytes_per_sec": 0, 00:35:53.493 "r_mbytes_per_sec": 0, 00:35:53.493 "w_mbytes_per_sec": 0 00:35:53.493 
}, 00:35:53.493 "claimed": false, 00:35:53.493 "zoned": false, 00:35:53.493 "supported_io_types": { 00:35:53.493 "read": true, 00:35:53.493 "write": true, 00:35:53.493 "unmap": false, 00:35:53.493 "flush": false, 00:35:53.493 "reset": false, 00:35:53.493 "nvme_admin": false, 00:35:53.493 "nvme_io": false, 00:35:53.493 "nvme_io_md": false, 00:35:53.493 "write_zeroes": true, 00:35:53.493 "zcopy": false, 00:35:53.493 "get_zone_info": false, 00:35:53.493 "zone_management": false, 00:35:53.493 "zone_append": false, 00:35:53.493 "compare": false, 00:35:53.493 "compare_and_write": false, 00:35:53.493 "abort": false, 00:35:53.493 "seek_hole": false, 00:35:53.493 "seek_data": false, 00:35:53.493 "copy": false, 00:35:53.493 "nvme_iov_md": false 00:35:53.493 }, 00:35:53.493 "driver_specific": { 00:35:53.493 "compress": { 00:35:53.493 "name": "COMP_lvs0/lv0", 00:35:53.493 "base_bdev_name": "aee878e1-445b-460c-baba-10a87e411eb9", 00:35:53.493 "pm_path": "/tmp/pmem/f6b46157-74ce-4efd-add6-a8021b0619cb" 00:35:53.493 } 00:35:53.493 } 00:35:53.493 } 00:35:53.493 ] 00:35:53.753 17:29:48 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:35:53.753 17:29:48 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:53.753 [2024-07-23 17:29:49.024551] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c25400 PMD being used: compress_qat 00:35:53.753 [2024-07-23 17:29:49.028747] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1bd819bc10 PMD being used: compress_qat 00:35:53.753 Running I/O for 30 seconds... 
00:36:25.839 00:36:25.839 Latency(us) 00:36:25.839 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:25.839 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 64, IO size: 16384) 00:36:25.839 Verification LBA range: start 0x0 length 0xc40 00:36:25.839 COMP_lvs0/lv0 : 30.01 3384.48 52.88 0.00 0.00 18742.41 2350.75 45590.26 00:36:25.839 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 64, IO size: 16384) 00:36:25.839 Verification LBA range: start 0xc40 length 0xc40 00:36:25.839 COMP_lvs0/lv0 : 30.02 847.53 13.24 0.00 0.00 75088.12 1289.35 65194.07 00:36:25.839 =================================================================================================================== 00:36:25.839 Total : 4232.02 66.13 0.00 0.00 30027.87 1289.35 65194.07 00:36:25.839 0 00:36:25.839 17:30:19 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:36:25.839 17:30:19 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:36:25.839 17:30:19 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:36:25.839 17:30:19 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:36:25.839 17:30:19 compress_compdev -- compress/compress.sh@78 -- # killprocess 95782 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 95782 ']' 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 95782 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 95782 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 95782' 00:36:25.839 killing process with pid 95782 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@967 -- # kill 95782 00:36:25.839 Received shutdown signal, test time was about 30.000000 seconds 00:36:25.839 00:36:25.839 Latency(us) 00:36:25.839 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:25.839 =================================================================================================================== 00:36:25.839 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:25.839 17:30:19 compress_compdev -- common/autotest_common.sh@972 -- # wait 95782 00:36:27.745 17:30:22 compress_compdev -- compress/compress.sh@95 -- # export TEST_TRANSPORT=tcp 00:36:27.745 17:30:22 compress_compdev -- compress/compress.sh@95 -- # TEST_TRANSPORT=tcp 00:36:27.745 17:30:22 compress_compdev -- compress/compress.sh@96 -- # NET_TYPE=virt 00:36:27.745 17:30:22 compress_compdev -- compress/compress.sh@96 -- # nvmftestinit 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@448 -- # prepare_net_devs 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@410 -- # local -g is_hw=no 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@412 -- # remove_spdk_ns 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:27.745 17:30:22 compress_compdev -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:27.745 17:30:22 compress_compdev -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@414 -- # [[ virt != 
virt ]] 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@423 -- # [[ virt == phy ]] 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@426 -- # [[ virt == phy-fallback ]] 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:27.745 Cannot find device "nvmf_tgt_br" 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@155 -- # true 00:36:27.745 17:30:22 compress_compdev -- 
nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:27.745 Cannot find device "nvmf_tgt_br2" 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@156 -- # true 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:27.745 Cannot find device "nvmf_tgt_br" 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@158 -- # true 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:27.745 Cannot find device "nvmf_tgt_br2" 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@159 -- # true 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:27.745 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@162 -- # true 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:27.745 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@163 -- # true 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:27.745 17:30:22 compress_compdev -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@174 -- # ip 
link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:27.745 17:30:23 compress_compdev -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@202 -- # iptables -A 
FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:28.004 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:36:28.004 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.090 ms 00:36:28.004 00:36:28.004 --- 10.0.0.2 ping statistics --- 00:36:28.004 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:28.004 rtt min/avg/max/mdev = 0.090/0.090/0.090/0.000 ms 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:28.004 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:36:28.004 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.073 ms 00:36:28.004 00:36:28.004 --- 10.0.0.3 ping statistics --- 00:36:28.004 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:28.004 rtt min/avg/max/mdev = 0.073/0.073/0.073/0.000 ms 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:28.004 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:36:28.004 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.047 ms 00:36:28.004 00:36:28.004 --- 10.0.0.1 ping statistics --- 00:36:28.004 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:28.004 rtt min/avg/max/mdev = 0.047/0.047/0.047/0.000 ms 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@433 -- # return 0 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:28.004 17:30:23 compress_compdev -- compress/compress.sh@97 -- # nvmfappstart -m 0x7 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@481 -- # nvmfpid=101613 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@482 -- # waitforlisten 101613 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 101613 ']' 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:28.004 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:28.004 17:30:23 compress_compdev -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:28.004 17:30:23 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:36:28.004 [2024-07-23 17:30:23.414500] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:36:28.004 [2024-07-23 17:30:23.414554] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:28.262 [2024-07-23 17:30:23.533107] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:36:28.262 [2024-07-23 17:30:23.586185] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:28.262 [2024-07-23 17:30:23.586235] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:36:28.262 [2024-07-23 17:30:23.586251] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:28.262 [2024-07-23 17:30:23.586264] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:28.262 [2024-07-23 17:30:23.586275] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:36:28.262 [2024-07-23 17:30:23.586451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:28.262 [2024-07-23 17:30:23.586551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:36:28.262 [2024-07-23 17:30:23.586553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:29.263 17:30:24 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:29.263 17:30:24 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:36:29.263 17:30:24 compress_compdev -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:29.263 17:30:24 compress_compdev -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:29.263 17:30:24 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:36:29.263 17:30:24 compress_compdev -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:29.263 17:30:24 compress_compdev -- compress/compress.sh@98 -- # trap 'nvmftestfini; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:36:29.263 17:30:24 compress_compdev -- compress/compress.sh@101 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t tcp -u 8192 00:36:29.263 [2024-07-23 17:30:24.640968] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:29.263 17:30:24 compress_compdev -- compress/compress.sh@102 -- # create_vols 00:36:29.263 17:30:24 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:36:29.263 17:30:24 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:36:30.198 17:30:25 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:36:30.198 17:30:25 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:36:30.456 [ 00:36:30.456 { 00:36:30.456 "name": "Nvme0n1", 00:36:30.456 "aliases": [ 00:36:30.456 "01000000-0000-0000-5cd2-e43197705251" 00:36:30.456 ], 00:36:30.456 "product_name": "NVMe disk", 00:36:30.456 "block_size": 512, 00:36:30.456 "num_blocks": 15002931888, 00:36:30.456 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:36:30.456 "assigned_rate_limits": { 00:36:30.456 "rw_ios_per_sec": 0, 00:36:30.456 "rw_mbytes_per_sec": 0, 00:36:30.456 "r_mbytes_per_sec": 0, 00:36:30.456 "w_mbytes_per_sec": 0 00:36:30.456 }, 00:36:30.456 "claimed": false, 00:36:30.456 "zoned": false, 00:36:30.456 "supported_io_types": { 00:36:30.456 "read": true, 00:36:30.456 "write": true, 00:36:30.456 "unmap": true, 00:36:30.456 "flush": true, 00:36:30.456 "reset": true, 00:36:30.456 "nvme_admin": true, 00:36:30.456 "nvme_io": true, 00:36:30.456 "nvme_io_md": false, 00:36:30.456 "write_zeroes": true, 00:36:30.456 "zcopy": false, 00:36:30.456 "get_zone_info": false, 00:36:30.456 "zone_management": false, 00:36:30.456 "zone_append": false, 00:36:30.456 "compare": false, 00:36:30.456 "compare_and_write": false, 00:36:30.456 "abort": true, 00:36:30.456 "seek_hole": false, 00:36:30.456 "seek_data": false, 00:36:30.456 "copy": false, 00:36:30.456 "nvme_iov_md": false 00:36:30.456 }, 00:36:30.456 "driver_specific": { 00:36:30.456 "nvme": [ 00:36:30.456 { 00:36:30.456 "pci_address": 
"0000:5e:00.0", 00:36:30.456 "trid": { 00:36:30.456 "trtype": "PCIe", 00:36:30.456 "traddr": "0000:5e:00.0" 00:36:30.456 }, 00:36:30.456 "ctrlr_data": { 00:36:30.456 "cntlid": 0, 00:36:30.456 "vendor_id": "0x8086", 00:36:30.456 "model_number": "INTEL SSDPF2KX076TZO", 00:36:30.456 "serial_number": "PHAC0301002G7P6CGN", 00:36:30.456 "firmware_revision": "JCV10200", 00:36:30.457 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:36:30.457 "oacs": { 00:36:30.457 "security": 1, 00:36:30.457 "format": 1, 00:36:30.457 "firmware": 1, 00:36:30.457 "ns_manage": 1 00:36:30.457 }, 00:36:30.457 "multi_ctrlr": false, 00:36:30.457 "ana_reporting": false 00:36:30.457 }, 00:36:30.457 "vs": { 00:36:30.457 "nvme_version": "1.3" 00:36:30.457 }, 00:36:30.457 "ns_data": { 00:36:30.457 "id": 1, 00:36:30.457 "can_share": false 00:36:30.457 }, 00:36:30.457 "security": { 00:36:30.457 "opal": true 00:36:30.457 } 00:36:30.457 } 00:36:30.457 ], 00:36:30.457 "mp_policy": "active_passive" 00:36:30.457 } 00:36:30.457 } 00:36:30.457 ] 00:36:30.457 17:30:25 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:36:30.457 17:30:25 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:36:32.987 0741d7fb-4e54-4a83-8a36-bcd0d7242ed1 00:36:32.987 17:30:28 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:36:33.245 bca67e65-0f4f-4ca6-af42-b595879d79a8 00:36:33.245 17:30:28 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:36:33.245 17:30:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:36:33.245 17:30:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:36:33.245 17:30:28 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:36:33.245 17:30:28 compress_compdev -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:36:33.245 17:30:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:36:33.245 17:30:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:36:33.504 17:30:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:36:33.762 [ 00:36:33.762 { 00:36:33.762 "name": "bca67e65-0f4f-4ca6-af42-b595879d79a8", 00:36:33.762 "aliases": [ 00:36:33.762 "lvs0/lv0" 00:36:33.762 ], 00:36:33.762 "product_name": "Logical Volume", 00:36:33.762 "block_size": 512, 00:36:33.762 "num_blocks": 204800, 00:36:33.762 "uuid": "bca67e65-0f4f-4ca6-af42-b595879d79a8", 00:36:33.762 "assigned_rate_limits": { 00:36:33.762 "rw_ios_per_sec": 0, 00:36:33.762 "rw_mbytes_per_sec": 0, 00:36:33.762 "r_mbytes_per_sec": 0, 00:36:33.762 "w_mbytes_per_sec": 0 00:36:33.762 }, 00:36:33.762 "claimed": false, 00:36:33.762 "zoned": false, 00:36:33.762 "supported_io_types": { 00:36:33.762 "read": true, 00:36:33.762 "write": true, 00:36:33.762 "unmap": true, 00:36:33.762 "flush": false, 00:36:33.762 "reset": true, 00:36:33.762 "nvme_admin": false, 00:36:33.762 "nvme_io": false, 00:36:33.762 "nvme_io_md": false, 00:36:33.762 "write_zeroes": true, 00:36:33.762 "zcopy": false, 00:36:33.762 "get_zone_info": false, 00:36:33.762 "zone_management": false, 00:36:33.762 "zone_append": false, 00:36:33.762 "compare": false, 00:36:33.762 "compare_and_write": false, 00:36:33.762 "abort": false, 00:36:33.762 "seek_hole": true, 00:36:33.762 "seek_data": true, 00:36:33.762 "copy": false, 00:36:33.762 "nvme_iov_md": false 00:36:33.762 }, 00:36:33.762 "driver_specific": { 00:36:33.762 "lvol": { 00:36:33.762 "lvol_store_uuid": "0741d7fb-4e54-4a83-8a36-bcd0d7242ed1", 00:36:33.762 "base_bdev": "Nvme0n1", 00:36:33.762 "thin_provision": true, 00:36:33.762 
"num_allocated_clusters": 0, 00:36:33.762 "snapshot": false, 00:36:33.762 "clone": false, 00:36:33.762 "esnap_clone": false 00:36:33.762 } 00:36:33.762 } 00:36:33.762 } 00:36:33.762 ] 00:36:33.762 17:30:29 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:36:33.762 17:30:29 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:36:33.762 17:30:29 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:36:34.020 [2024-07-23 17:30:29.335382] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:36:34.020 COMP_lvs0/lv0 00:36:34.020 17:30:29 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:36:34.020 17:30:29 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:36:34.020 17:30:29 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:36:34.020 17:30:29 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:36:34.020 17:30:29 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:36:34.020 17:30:29 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:36:34.020 17:30:29 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:36:34.278 17:30:29 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:36:34.536 [ 00:36:34.536 { 00:36:34.536 "name": "COMP_lvs0/lv0", 00:36:34.536 "aliases": [ 00:36:34.536 "066d284b-797a-5094-94b6-dbae082f3c32" 00:36:34.536 ], 00:36:34.536 "product_name": "compress", 00:36:34.536 "block_size": 512, 00:36:34.536 "num_blocks": 200704, 00:36:34.536 "uuid": "066d284b-797a-5094-94b6-dbae082f3c32", 00:36:34.536 "assigned_rate_limits": { 
00:36:34.536 "rw_ios_per_sec": 0, 00:36:34.536 "rw_mbytes_per_sec": 0, 00:36:34.536 "r_mbytes_per_sec": 0, 00:36:34.536 "w_mbytes_per_sec": 0 00:36:34.536 }, 00:36:34.536 "claimed": false, 00:36:34.536 "zoned": false, 00:36:34.536 "supported_io_types": { 00:36:34.536 "read": true, 00:36:34.536 "write": true, 00:36:34.536 "unmap": false, 00:36:34.536 "flush": false, 00:36:34.536 "reset": false, 00:36:34.536 "nvme_admin": false, 00:36:34.536 "nvme_io": false, 00:36:34.536 "nvme_io_md": false, 00:36:34.536 "write_zeroes": true, 00:36:34.536 "zcopy": false, 00:36:34.536 "get_zone_info": false, 00:36:34.536 "zone_management": false, 00:36:34.536 "zone_append": false, 00:36:34.536 "compare": false, 00:36:34.536 "compare_and_write": false, 00:36:34.536 "abort": false, 00:36:34.536 "seek_hole": false, 00:36:34.536 "seek_data": false, 00:36:34.536 "copy": false, 00:36:34.536 "nvme_iov_md": false 00:36:34.536 }, 00:36:34.536 "driver_specific": { 00:36:34.536 "compress": { 00:36:34.536 "name": "COMP_lvs0/lv0", 00:36:34.537 "base_bdev_name": "bca67e65-0f4f-4ca6-af42-b595879d79a8", 00:36:34.537 "pm_path": "/tmp/pmem/ec392d26-f055-4b87-842d-ae283b1b6c95" 00:36:34.537 } 00:36:34.537 } 00:36:34.537 } 00:36:34.537 ] 00:36:34.537 17:30:29 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:36:34.537 17:30:29 compress_compdev -- compress/compress.sh@103 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:36:34.795 17:30:30 compress_compdev -- compress/compress.sh@104 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0 00:36:35.053 17:30:30 compress_compdev -- compress/compress.sh@105 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420 00:36:35.311 [2024-07-23 17:30:30.580909] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** 
NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:35.311 17:30:30 compress_compdev -- compress/compress.sh@108 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 64 -s 512 -w randrw -t 30 -c 0x18 -M 50 00:36:35.311 17:30:30 compress_compdev -- compress/compress.sh@109 -- # perf_pid=102539 00:36:35.311 17:30:30 compress_compdev -- compress/compress.sh@112 -- # trap 'killprocess $perf_pid; compress_err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:36:35.311 17:30:30 compress_compdev -- compress/compress.sh@113 -- # wait 102539 00:36:35.569 [2024-07-23 17:30:30.890914] subsystem.c:1572:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on TCP/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:37:07.649 Initializing NVMe Controllers 00:37:07.649 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:37:07.649 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:37:07.649 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:37:07.649 Initialization complete. Launching workers. 
00:37:07.649 ======================================================== 00:37:07.649 Latency(us) 00:37:07.649 Device Information : IOPS MiB/s Average min max 00:37:07.649 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 3727.03 14.56 17174.57 2224.84 37977.72 00:37:07.649 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 2320.93 9.07 27582.67 2205.41 51786.99 00:37:07.649 ======================================================== 00:37:07.649 Total : 6047.97 23.62 21168.72 2205.41 51786.99 00:37:07.649 00:37:07.649 17:31:01 compress_compdev -- compress/compress.sh@114 -- # destroy_vols 00:37:07.649 17:31:01 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:37:07.649 17:31:01 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:37:07.649 17:31:01 compress_compdev -- compress/compress.sh@116 -- # trap - SIGINT SIGTERM EXIT 00:37:07.649 17:31:01 compress_compdev -- compress/compress.sh@117 -- # nvmftestfini 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@488 -- # nvmfcleanup 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@117 -- # sync 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@120 -- # set +e 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@121 -- # for i in {1..20} 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:37:07.649 rmmod nvme_tcp 00:37:07.649 rmmod nvme_fabrics 00:37:07.649 rmmod nvme_keyring 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@124 -- # set -e 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@125 -- # return 0 00:37:07.649 17:31:01 
compress_compdev -- nvmf/common.sh@489 -- # '[' -n 101613 ']' 00:37:07.649 17:31:01 compress_compdev -- nvmf/common.sh@490 -- # killprocess 101613 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 101613 ']' 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 101613 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 101613 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 101613' 00:37:07.649 killing process with pid 101613 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@967 -- # kill 101613 00:37:07.649 17:31:01 compress_compdev -- common/autotest_common.sh@972 -- # wait 101613 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@278 -- # remove_spdk_ns 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:37:09.554 17:31:04 compress_compdev -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:37:09.554 17:31:04 compress_compdev -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:37:09.554 17:31:04 compress_compdev -- nvmf/common.sh@279 -- # ip -4 addr 
flush nvmf_init_if 00:37:09.554 17:31:04 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:37:09.554 00:37:09.554 real 2m10.243s 00:37:09.554 user 6m1.691s 00:37:09.554 sys 0m23.317s 00:37:09.554 17:31:04 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:09.554 17:31:04 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:37:09.554 ************************************ 00:37:09.554 END TEST compress_compdev 00:37:09.554 ************************************ 00:37:09.554 17:31:04 -- common/autotest_common.sh@1142 -- # return 0 00:37:09.554 17:31:04 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:37:09.554 17:31:04 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:37:09.554 17:31:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:09.554 17:31:04 -- common/autotest_common.sh@10 -- # set +x 00:37:09.554 ************************************ 00:37:09.554 START TEST compress_isal 00:37:09.554 ************************************ 00:37:09.554 17:31:04 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:37:09.554 * Looking for test storage... 
00:37:09.554 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:37:09.554 17:31:04 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:37:09.554 17:31:04 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:09.555 17:31:04 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:37:09.555 17:31:04 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:09.555 17:31:04 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:09.555 17:31:04 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:09.555 17:31:04 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:09.555 17:31:04 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:09.555 17:31:04 compress_isal -- paths/export.sh@5 -- # export PATH 00:37:09.555 17:31:04 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@47 -- # : 0 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:37:09.555 17:31:04 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=106955 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:37:09.555 17:31:04 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 106955 00:37:09.555 17:31:04 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:37:09.555 17:31:04 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 106955 ']' 00:37:09.555 17:31:04 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:09.555 17:31:04 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:09.555 17:31:04 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:09.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:09.555 17:31:04 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:09.555 17:31:04 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:37:09.555 [2024-07-23 17:31:04.959345] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:37:09.555 [2024-07-23 17:31:04.959417] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid106955 ] 00:37:09.813 [2024-07-23 17:31:05.096775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:09.813 [2024-07-23 17:31:05.154544] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:37:09.813 [2024-07-23 17:31:05.154551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:10.746 17:31:05 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:10.746 17:31:05 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:37:10.746 17:31:05 compress_isal -- compress/compress.sh@74 -- # create_vols 00:37:10.746 17:31:05 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:37:10.746 17:31:05 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:37:11.315 17:31:06 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:37:11.315 17:31:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:37:11.315 17:31:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:11.315 17:31:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:11.315 17:31:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:11.315 17:31:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:11.315 17:31:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:11.574 17:31:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:37:11.834 [ 00:37:11.834 { 00:37:11.834 "name": "Nvme0n1", 00:37:11.834 "aliases": [ 00:37:11.834 "01000000-0000-0000-5cd2-e43197705251" 00:37:11.834 ], 00:37:11.834 "product_name": "NVMe disk", 00:37:11.834 "block_size": 512, 00:37:11.834 "num_blocks": 15002931888, 00:37:11.834 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:37:11.834 "assigned_rate_limits": { 00:37:11.834 "rw_ios_per_sec": 0, 00:37:11.834 "rw_mbytes_per_sec": 0, 00:37:11.834 "r_mbytes_per_sec": 0, 00:37:11.834 "w_mbytes_per_sec": 0 00:37:11.834 }, 00:37:11.834 "claimed": false, 00:37:11.834 "zoned": false, 00:37:11.834 "supported_io_types": { 00:37:11.834 "read": true, 00:37:11.834 "write": true, 00:37:11.834 "unmap": true, 00:37:11.834 "flush": true, 00:37:11.834 "reset": true, 00:37:11.834 "nvme_admin": true, 00:37:11.834 "nvme_io": true, 00:37:11.834 "nvme_io_md": false, 00:37:11.834 "write_zeroes": true, 00:37:11.834 "zcopy": false, 00:37:11.834 "get_zone_info": false, 00:37:11.834 "zone_management": false, 00:37:11.834 "zone_append": false, 00:37:11.834 "compare": false, 00:37:11.834 "compare_and_write": false, 00:37:11.834 "abort": true, 00:37:11.834 "seek_hole": false, 00:37:11.834 "seek_data": false, 00:37:11.834 "copy": false, 00:37:11.834 "nvme_iov_md": false 00:37:11.834 }, 00:37:11.834 "driver_specific": { 00:37:11.834 "nvme": [ 00:37:11.834 { 00:37:11.834 "pci_address": "0000:5e:00.0", 00:37:11.834 "trid": { 00:37:11.834 "trtype": "PCIe", 00:37:11.834 "traddr": "0000:5e:00.0" 00:37:11.834 }, 00:37:11.834 "ctrlr_data": { 00:37:11.834 "cntlid": 0, 00:37:11.834 "vendor_id": "0x8086", 00:37:11.834 "model_number": "INTEL SSDPF2KX076TZO", 00:37:11.834 "serial_number": "PHAC0301002G7P6CGN", 00:37:11.834 "firmware_revision": "JCV10200", 00:37:11.834 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:37:11.834 "oacs": { 00:37:11.834 "security": 1, 00:37:11.834 "format": 1, 00:37:11.834 "firmware": 1, 00:37:11.834 "ns_manage": 1 00:37:11.834 }, 
00:37:11.834 "multi_ctrlr": false, 00:37:11.834 "ana_reporting": false 00:37:11.834 }, 00:37:11.834 "vs": { 00:37:11.834 "nvme_version": "1.3" 00:37:11.834 }, 00:37:11.834 "ns_data": { 00:37:11.834 "id": 1, 00:37:11.834 "can_share": false 00:37:11.834 }, 00:37:11.834 "security": { 00:37:11.834 "opal": true 00:37:11.834 } 00:37:11.834 } 00:37:11.834 ], 00:37:11.834 "mp_policy": "active_passive" 00:37:11.834 } 00:37:11.834 } 00:37:11.834 ] 00:37:11.834 17:31:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:11.834 17:31:07 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:37:14.372 39f6bde6-d6b3-4480-a47a-e28e58cb792e 00:37:14.372 17:31:09 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:37:14.372 4c369070-0377-4933-903a-139b9437662f 00:37:14.372 17:31:09 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:37:14.372 17:31:09 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:37:14.372 17:31:09 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:14.372 17:31:09 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:14.372 17:31:09 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:14.372 17:31:09 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:14.372 17:31:09 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:14.631 17:31:09 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:37:14.891 [ 00:37:14.891 { 00:37:14.891 "name": "4c369070-0377-4933-903a-139b9437662f", 00:37:14.891 "aliases": [ 00:37:14.891 "lvs0/lv0" 
00:37:14.891 ], 00:37:14.891 "product_name": "Logical Volume", 00:37:14.891 "block_size": 512, 00:37:14.891 "num_blocks": 204800, 00:37:14.891 "uuid": "4c369070-0377-4933-903a-139b9437662f", 00:37:14.891 "assigned_rate_limits": { 00:37:14.891 "rw_ios_per_sec": 0, 00:37:14.891 "rw_mbytes_per_sec": 0, 00:37:14.891 "r_mbytes_per_sec": 0, 00:37:14.891 "w_mbytes_per_sec": 0 00:37:14.891 }, 00:37:14.891 "claimed": false, 00:37:14.891 "zoned": false, 00:37:14.891 "supported_io_types": { 00:37:14.891 "read": true, 00:37:14.891 "write": true, 00:37:14.891 "unmap": true, 00:37:14.891 "flush": false, 00:37:14.891 "reset": true, 00:37:14.891 "nvme_admin": false, 00:37:14.891 "nvme_io": false, 00:37:14.891 "nvme_io_md": false, 00:37:14.891 "write_zeroes": true, 00:37:14.891 "zcopy": false, 00:37:14.891 "get_zone_info": false, 00:37:14.891 "zone_management": false, 00:37:14.891 "zone_append": false, 00:37:14.891 "compare": false, 00:37:14.891 "compare_and_write": false, 00:37:14.891 "abort": false, 00:37:14.891 "seek_hole": true, 00:37:14.891 "seek_data": true, 00:37:14.891 "copy": false, 00:37:14.891 "nvme_iov_md": false 00:37:14.891 }, 00:37:14.891 "driver_specific": { 00:37:14.891 "lvol": { 00:37:14.891 "lvol_store_uuid": "39f6bde6-d6b3-4480-a47a-e28e58cb792e", 00:37:14.891 "base_bdev": "Nvme0n1", 00:37:14.891 "thin_provision": true, 00:37:14.891 "num_allocated_clusters": 0, 00:37:14.891 "snapshot": false, 00:37:14.891 "clone": false, 00:37:14.891 "esnap_clone": false 00:37:14.891 } 00:37:14.891 } 00:37:14.891 } 00:37:14.891 ] 00:37:14.891 17:31:10 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:14.891 17:31:10 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:37:14.891 17:31:10 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:37:15.150 [2024-07-23 17:31:10.418101] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:37:15.150 COMP_lvs0/lv0 00:37:15.150 17:31:10 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:37:15.150 17:31:10 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:37:15.150 17:31:10 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:15.150 17:31:10 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:15.150 17:31:10 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:15.150 17:31:10 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:15.150 17:31:10 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:15.443 17:31:10 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:37:15.721 [ 00:37:15.721 { 00:37:15.721 "name": "COMP_lvs0/lv0", 00:37:15.721 "aliases": [ 00:37:15.721 "6959e650-73a2-5ec9-865b-37c97abd44a1" 00:37:15.721 ], 00:37:15.721 "product_name": "compress", 00:37:15.721 "block_size": 512, 00:37:15.721 "num_blocks": 200704, 00:37:15.721 "uuid": "6959e650-73a2-5ec9-865b-37c97abd44a1", 00:37:15.721 "assigned_rate_limits": { 00:37:15.721 "rw_ios_per_sec": 0, 00:37:15.721 "rw_mbytes_per_sec": 0, 00:37:15.721 "r_mbytes_per_sec": 0, 00:37:15.721 "w_mbytes_per_sec": 0 00:37:15.721 }, 00:37:15.721 "claimed": false, 00:37:15.721 "zoned": false, 00:37:15.721 "supported_io_types": { 00:37:15.721 "read": true, 00:37:15.721 "write": true, 00:37:15.721 "unmap": false, 00:37:15.721 "flush": false, 00:37:15.721 "reset": false, 00:37:15.721 "nvme_admin": false, 00:37:15.721 "nvme_io": false, 00:37:15.721 "nvme_io_md": false, 00:37:15.721 "write_zeroes": true, 00:37:15.721 "zcopy": false, 00:37:15.721 "get_zone_info": false, 00:37:15.721 "zone_management": false, 00:37:15.721 "zone_append": 
false, 00:37:15.721 "compare": false, 00:37:15.721 "compare_and_write": false, 00:37:15.721 "abort": false, 00:37:15.721 "seek_hole": false, 00:37:15.721 "seek_data": false, 00:37:15.721 "copy": false, 00:37:15.721 "nvme_iov_md": false 00:37:15.721 }, 00:37:15.721 "driver_specific": { 00:37:15.721 "compress": { 00:37:15.721 "name": "COMP_lvs0/lv0", 00:37:15.721 "base_bdev_name": "4c369070-0377-4933-903a-139b9437662f", 00:37:15.721 "pm_path": "/tmp/pmem/725059ab-3a04-48ae-bd6e-cc4c4ca0d4b0" 00:37:15.721 } 00:37:15.721 } 00:37:15.721 } 00:37:15.721 ] 00:37:15.721 17:31:10 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:15.721 17:31:10 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:37:15.721 Running I/O for 3 seconds... 00:37:19.011 00:37:19.011 Latency(us) 00:37:19.011 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:19.011 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:37:19.011 Verification LBA range: start 0x0 length 0x3100 00:37:19.011 COMP_lvs0/lv0 : 3.01 1265.44 4.94 0.00 0.00 25178.22 2080.06 21655.37 00:37:19.011 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:37:19.011 Verification LBA range: start 0x3100 length 0x3100 00:37:19.011 COMP_lvs0/lv0 : 3.01 1266.76 4.95 0.00 0.00 25116.77 1538.67 20287.67 00:37:19.011 =================================================================================================================== 00:37:19.011 Total : 2532.20 9.89 0.00 0.00 25147.48 1538.67 21655.37 00:37:19.011 0 00:37:19.011 17:31:14 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:37:19.011 17:31:14 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:37:19.011 17:31:14 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:37:19.270 17:31:14 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:37:19.270 17:31:14 compress_isal -- compress/compress.sh@78 -- # killprocess 106955 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 106955 ']' 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@952 -- # kill -0 106955 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@953 -- # uname 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 106955 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 106955' 00:37:19.270 killing process with pid 106955 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@967 -- # kill 106955 00:37:19.270 Received shutdown signal, test time was about 3.000000 seconds 00:37:19.270 00:37:19.270 Latency(us) 00:37:19.270 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:19.270 =================================================================================================================== 00:37:19.270 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:19.270 17:31:14 compress_isal -- common/autotest_common.sh@972 -- # wait 106955 00:37:22.561 17:31:17 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:37:22.561 17:31:17 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:37:22.561 17:31:17 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=108561 00:37:22.561 17:31:17 compress_isal -- compress/compress.sh@72 -- # 
trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:37:22.561 17:31:17 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:37:22.561 17:31:17 compress_isal -- compress/compress.sh@73 -- # waitforlisten 108561 00:37:22.561 17:31:17 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 108561 ']' 00:37:22.561 17:31:17 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:22.561 17:31:17 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:22.561 17:31:17 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:22.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:22.561 17:31:17 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:22.561 17:31:17 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:37:22.561 [2024-07-23 17:31:17.765513] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:37:22.561 [2024-07-23 17:31:17.765586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid108561 ] 00:37:22.561 [2024-07-23 17:31:17.901584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:22.561 [2024-07-23 17:31:17.969170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:37:22.561 [2024-07-23 17:31:17.969179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:23.499 17:31:18 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:23.499 17:31:18 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:37:23.499 17:31:18 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:37:23.499 17:31:18 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:37:23.499 17:31:18 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:37:24.068 17:31:19 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:37:24.068 17:31:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:37:24.068 17:31:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:24.068 17:31:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:24.068 17:31:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:24.068 17:31:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:24.068 17:31:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:24.328 17:31:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:37:24.587 [ 00:37:24.587 { 00:37:24.587 "name": "Nvme0n1", 00:37:24.587 "aliases": [ 00:37:24.587 "01000000-0000-0000-5cd2-e43197705251" 00:37:24.587 ], 00:37:24.587 "product_name": "NVMe disk", 00:37:24.587 "block_size": 512, 00:37:24.587 "num_blocks": 15002931888, 00:37:24.587 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:37:24.587 "assigned_rate_limits": { 00:37:24.587 "rw_ios_per_sec": 0, 00:37:24.587 "rw_mbytes_per_sec": 0, 00:37:24.587 "r_mbytes_per_sec": 0, 00:37:24.587 "w_mbytes_per_sec": 0 00:37:24.587 }, 00:37:24.587 "claimed": false, 00:37:24.587 "zoned": false, 00:37:24.587 "supported_io_types": { 00:37:24.587 "read": true, 00:37:24.587 "write": true, 00:37:24.587 "unmap": true, 00:37:24.587 "flush": true, 00:37:24.587 "reset": true, 00:37:24.587 "nvme_admin": true, 00:37:24.587 "nvme_io": true, 00:37:24.588 "nvme_io_md": false, 00:37:24.588 "write_zeroes": true, 00:37:24.588 "zcopy": false, 00:37:24.588 "get_zone_info": false, 00:37:24.588 "zone_management": false, 00:37:24.588 "zone_append": false, 00:37:24.588 "compare": false, 00:37:24.588 "compare_and_write": false, 00:37:24.588 "abort": true, 00:37:24.588 "seek_hole": false, 00:37:24.588 "seek_data": false, 00:37:24.588 "copy": false, 00:37:24.588 "nvme_iov_md": false 00:37:24.588 }, 00:37:24.588 "driver_specific": { 00:37:24.588 "nvme": [ 00:37:24.588 { 00:37:24.588 "pci_address": "0000:5e:00.0", 00:37:24.588 "trid": { 00:37:24.588 "trtype": "PCIe", 00:37:24.588 "traddr": "0000:5e:00.0" 00:37:24.588 }, 00:37:24.588 "ctrlr_data": { 00:37:24.588 "cntlid": 0, 00:37:24.588 "vendor_id": "0x8086", 00:37:24.588 "model_number": "INTEL SSDPF2KX076TZO", 00:37:24.588 "serial_number": "PHAC0301002G7P6CGN", 00:37:24.588 "firmware_revision": "JCV10200", 00:37:24.588 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:37:24.588 "oacs": { 00:37:24.588 "security": 1, 00:37:24.588 "format": 1, 00:37:24.588 "firmware": 1, 00:37:24.588 "ns_manage": 1 00:37:24.588 }, 
00:37:24.588 "multi_ctrlr": false, 00:37:24.588 "ana_reporting": false 00:37:24.588 }, 00:37:24.588 "vs": { 00:37:24.588 "nvme_version": "1.3" 00:37:24.588 }, 00:37:24.588 "ns_data": { 00:37:24.588 "id": 1, 00:37:24.588 "can_share": false 00:37:24.588 }, 00:37:24.588 "security": { 00:37:24.588 "opal": true 00:37:24.588 } 00:37:24.588 } 00:37:24.588 ], 00:37:24.588 "mp_policy": "active_passive" 00:37:24.588 } 00:37:24.588 } 00:37:24.588 ] 00:37:24.588 17:31:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:24.588 17:31:19 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:37:27.125 7b53a65f-c3a6-45ad-8e36-08f0ae097b29 00:37:27.125 17:31:22 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:37:27.125 b7119641-89ea-4a65-95f7-aad99cf56f12 00:37:27.125 17:31:22 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:37:27.125 17:31:22 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:37:27.125 17:31:22 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:27.125 17:31:22 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:27.125 17:31:22 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:27.125 17:31:22 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:27.125 17:31:22 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:27.384 17:31:22 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:37:27.643 [ 00:37:27.643 { 00:37:27.643 "name": "b7119641-89ea-4a65-95f7-aad99cf56f12", 00:37:27.643 "aliases": [ 00:37:27.643 "lvs0/lv0" 
00:37:27.643 ], 00:37:27.643 "product_name": "Logical Volume", 00:37:27.643 "block_size": 512, 00:37:27.643 "num_blocks": 204800, 00:37:27.643 "uuid": "b7119641-89ea-4a65-95f7-aad99cf56f12", 00:37:27.643 "assigned_rate_limits": { 00:37:27.643 "rw_ios_per_sec": 0, 00:37:27.643 "rw_mbytes_per_sec": 0, 00:37:27.643 "r_mbytes_per_sec": 0, 00:37:27.643 "w_mbytes_per_sec": 0 00:37:27.643 }, 00:37:27.643 "claimed": false, 00:37:27.643 "zoned": false, 00:37:27.643 "supported_io_types": { 00:37:27.643 "read": true, 00:37:27.643 "write": true, 00:37:27.643 "unmap": true, 00:37:27.643 "flush": false, 00:37:27.643 "reset": true, 00:37:27.643 "nvme_admin": false, 00:37:27.643 "nvme_io": false, 00:37:27.643 "nvme_io_md": false, 00:37:27.643 "write_zeroes": true, 00:37:27.643 "zcopy": false, 00:37:27.643 "get_zone_info": false, 00:37:27.643 "zone_management": false, 00:37:27.643 "zone_append": false, 00:37:27.643 "compare": false, 00:37:27.644 "compare_and_write": false, 00:37:27.644 "abort": false, 00:37:27.644 "seek_hole": true, 00:37:27.644 "seek_data": true, 00:37:27.644 "copy": false, 00:37:27.644 "nvme_iov_md": false 00:37:27.644 }, 00:37:27.644 "driver_specific": { 00:37:27.644 "lvol": { 00:37:27.644 "lvol_store_uuid": "7b53a65f-c3a6-45ad-8e36-08f0ae097b29", 00:37:27.644 "base_bdev": "Nvme0n1", 00:37:27.644 "thin_provision": true, 00:37:27.644 "num_allocated_clusters": 0, 00:37:27.644 "snapshot": false, 00:37:27.644 "clone": false, 00:37:27.644 "esnap_clone": false 00:37:27.644 } 00:37:27.644 } 00:37:27.644 } 00:37:27.644 ] 00:37:27.644 17:31:23 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:27.644 17:31:23 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:37:27.644 17:31:23 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:37:27.902 [2024-07-23 17:31:23.246310] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:37:27.902 COMP_lvs0/lv0 00:37:27.902 17:31:23 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:37:27.902 17:31:23 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:37:27.902 17:31:23 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:27.902 17:31:23 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:27.902 17:31:23 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:27.902 17:31:23 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:27.902 17:31:23 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:28.161 17:31:23 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:37:28.419 [ 00:37:28.419 { 00:37:28.419 "name": "COMP_lvs0/lv0", 00:37:28.419 "aliases": [ 00:37:28.419 "fa6aad12-b824-527a-a97d-64bc3a1223b3" 00:37:28.419 ], 00:37:28.419 "product_name": "compress", 00:37:28.419 "block_size": 512, 00:37:28.419 "num_blocks": 200704, 00:37:28.419 "uuid": "fa6aad12-b824-527a-a97d-64bc3a1223b3", 00:37:28.419 "assigned_rate_limits": { 00:37:28.419 "rw_ios_per_sec": 0, 00:37:28.419 "rw_mbytes_per_sec": 0, 00:37:28.419 "r_mbytes_per_sec": 0, 00:37:28.419 "w_mbytes_per_sec": 0 00:37:28.419 }, 00:37:28.419 "claimed": false, 00:37:28.419 "zoned": false, 00:37:28.419 "supported_io_types": { 00:37:28.419 "read": true, 00:37:28.419 "write": true, 00:37:28.419 "unmap": false, 00:37:28.419 "flush": false, 00:37:28.419 "reset": false, 00:37:28.419 "nvme_admin": false, 00:37:28.419 "nvme_io": false, 00:37:28.419 "nvme_io_md": false, 00:37:28.419 "write_zeroes": true, 00:37:28.419 "zcopy": false, 00:37:28.419 "get_zone_info": false, 00:37:28.419 "zone_management": false, 00:37:28.419 "zone_append": 
false, 00:37:28.419 "compare": false, 00:37:28.419 "compare_and_write": false, 00:37:28.419 "abort": false, 00:37:28.419 "seek_hole": false, 00:37:28.419 "seek_data": false, 00:37:28.419 "copy": false, 00:37:28.419 "nvme_iov_md": false 00:37:28.419 }, 00:37:28.419 "driver_specific": { 00:37:28.419 "compress": { 00:37:28.419 "name": "COMP_lvs0/lv0", 00:37:28.419 "base_bdev_name": "b7119641-89ea-4a65-95f7-aad99cf56f12", 00:37:28.419 "pm_path": "/tmp/pmem/54a8a6d4-e473-4771-b109-ff943d8f2516" 00:37:28.419 } 00:37:28.419 } 00:37:28.419 } 00:37:28.419 ] 00:37:28.419 17:31:23 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:28.419 17:31:23 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:37:28.678 Running I/O for 3 seconds... 00:37:31.968 00:37:31.968 Latency(us) 00:37:31.968 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:31.968 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:37:31.968 Verification LBA range: start 0x0 length 0x3100 00:37:31.968 COMP_lvs0/lv0 : 3.01 2102.05 8.21 0.00 0.00 15131.28 1154.00 13278.16 00:37:31.968 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:37:31.968 Verification LBA range: start 0x3100 length 0x3100 00:37:31.968 COMP_lvs0/lv0 : 3.01 2098.53 8.20 0.00 0.00 15114.68 1367.71 12879.25 00:37:31.968 =================================================================================================================== 00:37:31.968 Total : 4200.58 16.41 0.00 0.00 15122.99 1154.00 13278.16 00:37:31.968 0 00:37:31.968 17:31:26 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:37:31.969 17:31:26 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:37:31.969 17:31:27 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:37:32.228 17:31:27 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:37:32.228 17:31:27 compress_isal -- compress/compress.sh@78 -- # killprocess 108561 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 108561 ']' 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@952 -- # kill -0 108561 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@953 -- # uname 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 108561 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 108561' 00:37:32.228 killing process with pid 108561 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@967 -- # kill 108561 00:37:32.228 Received shutdown signal, test time was about 3.000000 seconds 00:37:32.228 00:37:32.228 Latency(us) 00:37:32.228 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:32.228 =================================================================================================================== 00:37:32.228 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:32.228 17:31:27 compress_isal -- common/autotest_common.sh@972 -- # wait 108561 00:37:35.519 17:31:30 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:37:35.519 17:31:30 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:37:35.519 17:31:30 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=110157 00:37:35.519 17:31:30 compress_isal -- compress/compress.sh@72 -- # 
trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:37:35.519 17:31:30 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:37:35.519 17:31:30 compress_isal -- compress/compress.sh@73 -- # waitforlisten 110157 00:37:35.519 17:31:30 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 110157 ']' 00:37:35.519 17:31:30 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:35.519 17:31:30 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:35.519 17:31:30 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:35.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:35.519 17:31:30 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:35.519 17:31:30 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:37:35.519 [2024-07-23 17:31:30.647336] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:37:35.519 [2024-07-23 17:31:30.647415] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid110157 ] 00:37:35.519 [2024-07-23 17:31:30.789456] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:35.519 [2024-07-23 17:31:30.863948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:37:35.519 [2024-07-23 17:31:30.863955] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:36.454 17:31:31 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:36.454 17:31:31 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:37:36.454 17:31:31 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:37:36.454 17:31:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:37:36.454 17:31:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:37:37.022 17:31:32 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:37:37.022 17:31:32 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:37:37.022 17:31:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:37.022 17:31:32 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:37.022 17:31:32 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:37.022 17:31:32 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:37.022 17:31:32 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:37.281 17:31:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:37:37.281 [ 00:37:37.281 { 00:37:37.281 "name": "Nvme0n1", 00:37:37.281 "aliases": [ 00:37:37.281 "01000000-0000-0000-5cd2-e43197705251" 00:37:37.281 ], 00:37:37.281 "product_name": "NVMe disk", 00:37:37.281 "block_size": 512, 00:37:37.281 "num_blocks": 15002931888, 00:37:37.281 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:37:37.281 "assigned_rate_limits": { 00:37:37.281 "rw_ios_per_sec": 0, 00:37:37.281 "rw_mbytes_per_sec": 0, 00:37:37.281 "r_mbytes_per_sec": 0, 00:37:37.281 "w_mbytes_per_sec": 0 00:37:37.281 }, 00:37:37.281 "claimed": false, 00:37:37.281 "zoned": false, 00:37:37.281 "supported_io_types": { 00:37:37.281 "read": true, 00:37:37.281 "write": true, 00:37:37.281 "unmap": true, 00:37:37.281 "flush": true, 00:37:37.281 "reset": true, 00:37:37.281 "nvme_admin": true, 00:37:37.281 "nvme_io": true, 00:37:37.281 "nvme_io_md": false, 00:37:37.281 "write_zeroes": true, 00:37:37.281 "zcopy": false, 00:37:37.281 "get_zone_info": false, 00:37:37.281 "zone_management": false, 00:37:37.281 "zone_append": false, 00:37:37.281 "compare": false, 00:37:37.281 "compare_and_write": false, 00:37:37.281 "abort": true, 00:37:37.281 "seek_hole": false, 00:37:37.281 "seek_data": false, 00:37:37.281 "copy": false, 00:37:37.281 "nvme_iov_md": false 00:37:37.281 }, 00:37:37.281 "driver_specific": { 00:37:37.281 "nvme": [ 00:37:37.281 { 00:37:37.281 "pci_address": "0000:5e:00.0", 00:37:37.282 "trid": { 00:37:37.282 "trtype": "PCIe", 00:37:37.282 "traddr": "0000:5e:00.0" 00:37:37.282 }, 00:37:37.282 "ctrlr_data": { 00:37:37.282 "cntlid": 0, 00:37:37.282 "vendor_id": "0x8086", 00:37:37.282 "model_number": "INTEL SSDPF2KX076TZO", 00:37:37.282 "serial_number": "PHAC0301002G7P6CGN", 00:37:37.282 "firmware_revision": "JCV10200", 00:37:37.282 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:37:37.282 "oacs": { 00:37:37.282 "security": 1, 00:37:37.282 "format": 1, 00:37:37.282 "firmware": 1, 00:37:37.282 "ns_manage": 1 00:37:37.282 }, 
00:37:37.282 "multi_ctrlr": false, 00:37:37.282 "ana_reporting": false 00:37:37.282 }, 00:37:37.282 "vs": { 00:37:37.282 "nvme_version": "1.3" 00:37:37.282 }, 00:37:37.282 "ns_data": { 00:37:37.282 "id": 1, 00:37:37.282 "can_share": false 00:37:37.282 }, 00:37:37.282 "security": { 00:37:37.282 "opal": true 00:37:37.282 } 00:37:37.282 } 00:37:37.282 ], 00:37:37.282 "mp_policy": "active_passive" 00:37:37.282 } 00:37:37.282 } 00:37:37.282 ] 00:37:37.282 17:31:32 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:37.282 17:31:32 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:37:39.817 7e2f7656-293c-4235-bca5-82ee4cf177b6 00:37:39.817 17:31:35 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:37:40.078 4fbc4b88-e9ca-4468-a330-ed2fd467dc27 00:37:40.078 17:31:35 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:37:40.078 17:31:35 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:37:40.078 17:31:35 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:40.078 17:31:35 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:40.078 17:31:35 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:40.078 17:31:35 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:40.078 17:31:35 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:40.380 17:31:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:37:40.649 [ 00:37:40.649 { 00:37:40.649 "name": "4fbc4b88-e9ca-4468-a330-ed2fd467dc27", 00:37:40.649 "aliases": [ 00:37:40.649 "lvs0/lv0" 
00:37:40.649 ], 00:37:40.649 "product_name": "Logical Volume", 00:37:40.649 "block_size": 512, 00:37:40.649 "num_blocks": 204800, 00:37:40.649 "uuid": "4fbc4b88-e9ca-4468-a330-ed2fd467dc27", 00:37:40.649 "assigned_rate_limits": { 00:37:40.649 "rw_ios_per_sec": 0, 00:37:40.649 "rw_mbytes_per_sec": 0, 00:37:40.649 "r_mbytes_per_sec": 0, 00:37:40.649 "w_mbytes_per_sec": 0 00:37:40.649 }, 00:37:40.649 "claimed": false, 00:37:40.649 "zoned": false, 00:37:40.649 "supported_io_types": { 00:37:40.649 "read": true, 00:37:40.649 "write": true, 00:37:40.649 "unmap": true, 00:37:40.649 "flush": false, 00:37:40.649 "reset": true, 00:37:40.649 "nvme_admin": false, 00:37:40.649 "nvme_io": false, 00:37:40.649 "nvme_io_md": false, 00:37:40.649 "write_zeroes": true, 00:37:40.649 "zcopy": false, 00:37:40.649 "get_zone_info": false, 00:37:40.649 "zone_management": false, 00:37:40.649 "zone_append": false, 00:37:40.649 "compare": false, 00:37:40.649 "compare_and_write": false, 00:37:40.649 "abort": false, 00:37:40.649 "seek_hole": true, 00:37:40.649 "seek_data": true, 00:37:40.649 "copy": false, 00:37:40.649 "nvme_iov_md": false 00:37:40.649 }, 00:37:40.649 "driver_specific": { 00:37:40.649 "lvol": { 00:37:40.649 "lvol_store_uuid": "7e2f7656-293c-4235-bca5-82ee4cf177b6", 00:37:40.649 "base_bdev": "Nvme0n1", 00:37:40.649 "thin_provision": true, 00:37:40.649 "num_allocated_clusters": 0, 00:37:40.649 "snapshot": false, 00:37:40.649 "clone": false, 00:37:40.649 "esnap_clone": false 00:37:40.649 } 00:37:40.649 } 00:37:40.649 } 00:37:40.649 ] 00:37:40.649 17:31:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:40.649 17:31:35 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:37:40.649 17:31:35 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:37:40.908 [2024-07-23 17:31:36.121845] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:37:40.908 COMP_lvs0/lv0 00:37:40.908 17:31:36 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:37:40.908 17:31:36 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:37:40.908 17:31:36 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:40.908 17:31:36 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:40.908 17:31:36 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:40.908 17:31:36 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:40.908 17:31:36 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:41.167 17:31:36 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:37:41.736 [ 00:37:41.736 { 00:37:41.736 "name": "COMP_lvs0/lv0", 00:37:41.736 "aliases": [ 00:37:41.736 "4c90e8f4-dfa9-5632-89b1-0a0f5ff340c0" 00:37:41.736 ], 00:37:41.736 "product_name": "compress", 00:37:41.736 "block_size": 4096, 00:37:41.736 "num_blocks": 25088, 00:37:41.736 "uuid": "4c90e8f4-dfa9-5632-89b1-0a0f5ff340c0", 00:37:41.736 "assigned_rate_limits": { 00:37:41.736 "rw_ios_per_sec": 0, 00:37:41.736 "rw_mbytes_per_sec": 0, 00:37:41.736 "r_mbytes_per_sec": 0, 00:37:41.736 "w_mbytes_per_sec": 0 00:37:41.736 }, 00:37:41.736 "claimed": false, 00:37:41.736 "zoned": false, 00:37:41.736 "supported_io_types": { 00:37:41.736 "read": true, 00:37:41.736 "write": true, 00:37:41.736 "unmap": false, 00:37:41.736 "flush": false, 00:37:41.736 "reset": false, 00:37:41.736 "nvme_admin": false, 00:37:41.736 "nvme_io": false, 00:37:41.736 "nvme_io_md": false, 00:37:41.736 "write_zeroes": true, 00:37:41.736 "zcopy": false, 00:37:41.736 "get_zone_info": false, 00:37:41.736 "zone_management": false, 00:37:41.736 
"zone_append": false, 00:37:41.736 "compare": false, 00:37:41.736 "compare_and_write": false, 00:37:41.736 "abort": false, 00:37:41.736 "seek_hole": false, 00:37:41.736 "seek_data": false, 00:37:41.736 "copy": false, 00:37:41.736 "nvme_iov_md": false 00:37:41.736 }, 00:37:41.736 "driver_specific": { 00:37:41.736 "compress": { 00:37:41.736 "name": "COMP_lvs0/lv0", 00:37:41.736 "base_bdev_name": "4fbc4b88-e9ca-4468-a330-ed2fd467dc27", 00:37:41.736 "pm_path": "/tmp/pmem/31fdc014-91f7-4dfb-bdc8-00e7ce6109a9" 00:37:41.736 } 00:37:41.736 } 00:37:41.736 } 00:37:41.736 ] 00:37:41.736 17:31:36 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:41.736 17:31:36 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:37:41.736 Running I/O for 3 seconds... 00:37:45.027 00:37:45.027 Latency(us) 00:37:45.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:45.027 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:37:45.027 Verification LBA range: start 0x0 length 0x3100 00:37:45.027 COMP_lvs0/lv0 : 3.01 1268.11 4.95 0.00 0.00 25125.69 2464.72 21997.30 00:37:45.027 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:37:45.027 Verification LBA range: start 0x3100 length 0x3100 00:37:45.027 COMP_lvs0/lv0 : 3.01 1270.31 4.96 0.00 0.00 25053.43 1389.08 20629.59 00:37:45.027 =================================================================================================================== 00:37:45.027 Total : 2538.42 9.92 0.00 0.00 25089.53 1389.08 21997.30 00:37:45.027 0 00:37:45.027 17:31:40 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:37:45.027 17:31:40 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:37:45.027 17:31:40 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:37:45.287 17:31:40 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:37:45.287 17:31:40 compress_isal -- compress/compress.sh@78 -- # killprocess 110157 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 110157 ']' 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@952 -- # kill -0 110157 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@953 -- # uname 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 110157 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 110157' 00:37:45.287 killing process with pid 110157 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@967 -- # kill 110157 00:37:45.287 Received shutdown signal, test time was about 3.000000 seconds 00:37:45.287 00:37:45.287 Latency(us) 00:37:45.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:45.287 =================================================================================================================== 00:37:45.287 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:45.287 17:31:40 compress_isal -- common/autotest_common.sh@972 -- # wait 110157 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=111880 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess 
$bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@57 -- # waitforlisten 111880 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 111880 ']' 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:48.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:37:48.576 [2024-07-23 17:31:43.660581] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
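Both the bdevperf and bdevio flows above lean on `waitforbdev`, which issues `bdev_wait_for_examine` and then polls `bdev_get_bdevs -b <name> -t 2000` until the bdev appears. A generic sketch of that poll-with-timeout shape (`CHECK_CMD` is a stand-in introduced here only so the loop can be exercised without a running SPDK target; the real helper lives in `autotest_common.sh`):

```shell
# Poll until "$CHECK_CMD <bdev_name>" succeeds, or fail after timeout_ms.
# Mirrors only the retry shape of the autotest helper; the probe command
# (normally "rpc.py bdev_get_bdevs -b") is made pluggable for illustration.
waitforbdev() {
    local bdev_name=$1 timeout_ms=${2:-2000} waited=0
    local check=${CHECK_CMD:-"scripts/rpc.py bdev_get_bdevs -b"}
    while ! $check "$bdev_name" > /dev/null 2>&1; do
        [ "$waited" -ge "$timeout_ms" ] && return 1   # give up
        sleep 0.1
        waited=$((waited + 100))
    done
    return 0
}
```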
00:37:48.576 [2024-07-23 17:31:43.660657] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid111880 ] 00:37:48.576 [2024-07-23 17:31:43.796266] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:37:48.576 [2024-07-23 17:31:43.852701] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:48.576 [2024-07-23 17:31:43.852736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:37:48.576 [2024-07-23 17:31:43.852738] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:48.576 17:31:43 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@58 -- # create_vols 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:37:48.576 17:31:43 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:37:49.513 17:31:44 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:37:49.513 17:31:44 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:37:49.513 17:31:44 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:49.513 17:31:44 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:49.513 17:31:44 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:49.513 17:31:44 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:49.513 17:31:44 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:49.513 17:31:44 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:37:49.772 [ 00:37:49.772 { 00:37:49.772 "name": "Nvme0n1", 00:37:49.772 "aliases": [ 00:37:49.772 "01000000-0000-0000-5cd2-e43197705251" 00:37:49.772 ], 00:37:49.772 "product_name": "NVMe disk", 00:37:49.772 "block_size": 512, 00:37:49.772 "num_blocks": 15002931888, 00:37:49.772 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:37:49.772 "assigned_rate_limits": { 00:37:49.772 "rw_ios_per_sec": 0, 00:37:49.772 "rw_mbytes_per_sec": 0, 00:37:49.772 "r_mbytes_per_sec": 0, 00:37:49.772 "w_mbytes_per_sec": 0 00:37:49.772 }, 00:37:49.772 "claimed": false, 00:37:49.772 "zoned": false, 00:37:49.772 "supported_io_types": { 00:37:49.772 "read": true, 00:37:49.772 "write": true, 00:37:49.772 "unmap": true, 00:37:49.772 "flush": true, 00:37:49.772 "reset": true, 00:37:49.772 "nvme_admin": true, 00:37:49.772 "nvme_io": true, 00:37:49.772 "nvme_io_md": false, 00:37:49.772 "write_zeroes": true, 00:37:49.772 "zcopy": false, 00:37:49.772 "get_zone_info": false, 00:37:49.772 "zone_management": false, 00:37:49.772 "zone_append": false, 00:37:49.772 "compare": false, 00:37:49.772 "compare_and_write": false, 00:37:49.772 "abort": true, 00:37:49.772 "seek_hole": false, 00:37:49.772 "seek_data": false, 00:37:49.772 "copy": false, 00:37:49.772 "nvme_iov_md": false 00:37:49.772 }, 00:37:49.772 "driver_specific": { 00:37:49.772 "nvme": [ 00:37:49.772 { 00:37:49.772 "pci_address": "0000:5e:00.0", 00:37:49.772 "trid": { 00:37:49.772 "trtype": "PCIe", 00:37:49.772 "traddr": "0000:5e:00.0" 00:37:49.772 }, 00:37:49.772 "ctrlr_data": { 00:37:49.772 "cntlid": 0, 00:37:49.772 "vendor_id": "0x8086", 00:37:49.772 "model_number": "INTEL SSDPF2KX076TZO", 00:37:49.772 "serial_number": "PHAC0301002G7P6CGN", 00:37:49.772 "firmware_revision": "JCV10200", 00:37:49.772 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:37:49.772 "oacs": { 00:37:49.772 "security": 1, 
00:37:49.772 "format": 1, 00:37:49.772 "firmware": 1, 00:37:49.772 "ns_manage": 1 00:37:49.772 }, 00:37:49.772 "multi_ctrlr": false, 00:37:49.772 "ana_reporting": false 00:37:49.772 }, 00:37:49.772 "vs": { 00:37:49.772 "nvme_version": "1.3" 00:37:49.772 }, 00:37:49.772 "ns_data": { 00:37:49.772 "id": 1, 00:37:49.772 "can_share": false 00:37:49.772 }, 00:37:49.772 "security": { 00:37:49.772 "opal": true 00:37:49.772 } 00:37:49.772 } 00:37:49.772 ], 00:37:49.772 "mp_policy": "active_passive" 00:37:49.772 } 00:37:49.772 } 00:37:49.772 ] 00:37:49.772 17:31:45 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:49.772 17:31:45 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:37:52.305 ca870edf-0f2a-45f3-b24e-ea0f8dba6662 00:37:52.305 17:31:47 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:37:52.564 3caddf27-2f9d-4f9c-83a0-c7e9a897488f 00:37:52.564 17:31:47 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:37:52.564 17:31:47 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:37:52.564 17:31:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:52.564 17:31:47 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:52.564 17:31:47 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:52.564 17:31:47 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:52.564 17:31:47 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:52.823 17:31:48 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:37:53.082 [ 00:37:53.082 { 00:37:53.082 
"name": "3caddf27-2f9d-4f9c-83a0-c7e9a897488f", 00:37:53.082 "aliases": [ 00:37:53.082 "lvs0/lv0" 00:37:53.082 ], 00:37:53.082 "product_name": "Logical Volume", 00:37:53.082 "block_size": 512, 00:37:53.082 "num_blocks": 204800, 00:37:53.082 "uuid": "3caddf27-2f9d-4f9c-83a0-c7e9a897488f", 00:37:53.082 "assigned_rate_limits": { 00:37:53.082 "rw_ios_per_sec": 0, 00:37:53.082 "rw_mbytes_per_sec": 0, 00:37:53.082 "r_mbytes_per_sec": 0, 00:37:53.082 "w_mbytes_per_sec": 0 00:37:53.082 }, 00:37:53.082 "claimed": false, 00:37:53.082 "zoned": false, 00:37:53.082 "supported_io_types": { 00:37:53.082 "read": true, 00:37:53.082 "write": true, 00:37:53.082 "unmap": true, 00:37:53.082 "flush": false, 00:37:53.082 "reset": true, 00:37:53.082 "nvme_admin": false, 00:37:53.082 "nvme_io": false, 00:37:53.082 "nvme_io_md": false, 00:37:53.082 "write_zeroes": true, 00:37:53.082 "zcopy": false, 00:37:53.082 "get_zone_info": false, 00:37:53.082 "zone_management": false, 00:37:53.082 "zone_append": false, 00:37:53.082 "compare": false, 00:37:53.082 "compare_and_write": false, 00:37:53.082 "abort": false, 00:37:53.082 "seek_hole": true, 00:37:53.082 "seek_data": true, 00:37:53.082 "copy": false, 00:37:53.082 "nvme_iov_md": false 00:37:53.082 }, 00:37:53.082 "driver_specific": { 00:37:53.082 "lvol": { 00:37:53.082 "lvol_store_uuid": "ca870edf-0f2a-45f3-b24e-ea0f8dba6662", 00:37:53.082 "base_bdev": "Nvme0n1", 00:37:53.082 "thin_provision": true, 00:37:53.082 "num_allocated_clusters": 0, 00:37:53.082 "snapshot": false, 00:37:53.082 "clone": false, 00:37:53.082 "esnap_clone": false 00:37:53.082 } 00:37:53.082 } 00:37:53.082 } 00:37:53.082 ] 00:37:53.082 17:31:48 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:53.082 17:31:48 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:37:53.082 17:31:48 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:37:53.341 
[2024-07-23 17:31:48.548451] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:37:53.341 COMP_lvs0/lv0 00:37:53.341 17:31:48 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:37:53.341 17:31:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:37:53.341 17:31:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:53.341 17:31:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:53.341 17:31:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:53.341 17:31:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:53.341 17:31:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:53.600 17:31:48 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:37:53.600 [ 00:37:53.600 { 00:37:53.600 "name": "COMP_lvs0/lv0", 00:37:53.600 "aliases": [ 00:37:53.600 "7d465943-a123-51ca-872e-fb19e89f7c05" 00:37:53.600 ], 00:37:53.600 "product_name": "compress", 00:37:53.600 "block_size": 512, 00:37:53.600 "num_blocks": 200704, 00:37:53.600 "uuid": "7d465943-a123-51ca-872e-fb19e89f7c05", 00:37:53.600 "assigned_rate_limits": { 00:37:53.600 "rw_ios_per_sec": 0, 00:37:53.600 "rw_mbytes_per_sec": 0, 00:37:53.600 "r_mbytes_per_sec": 0, 00:37:53.600 "w_mbytes_per_sec": 0 00:37:53.600 }, 00:37:53.600 "claimed": false, 00:37:53.600 "zoned": false, 00:37:53.600 "supported_io_types": { 00:37:53.600 "read": true, 00:37:53.600 "write": true, 00:37:53.600 "unmap": false, 00:37:53.600 "flush": false, 00:37:53.600 "reset": false, 00:37:53.600 "nvme_admin": false, 00:37:53.600 "nvme_io": false, 00:37:53.600 "nvme_io_md": false, 00:37:53.600 "write_zeroes": true, 00:37:53.600 "zcopy": false, 00:37:53.600 
"get_zone_info": false, 00:37:53.600 "zone_management": false, 00:37:53.600 "zone_append": false, 00:37:53.600 "compare": false, 00:37:53.600 "compare_and_write": false, 00:37:53.600 "abort": false, 00:37:53.600 "seek_hole": false, 00:37:53.600 "seek_data": false, 00:37:53.600 "copy": false, 00:37:53.600 "nvme_iov_md": false 00:37:53.600 }, 00:37:53.600 "driver_specific": { 00:37:53.600 "compress": { 00:37:53.600 "name": "COMP_lvs0/lv0", 00:37:53.600 "base_bdev_name": "3caddf27-2f9d-4f9c-83a0-c7e9a897488f", 00:37:53.600 "pm_path": "/tmp/pmem/3a9e1d80-9b7a-4e24-b7d3-199309221df8" 00:37:53.600 } 00:37:53.600 } 00:37:53.600 } 00:37:53.600 ] 00:37:53.859 17:31:49 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:53.860 17:31:49 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:37:53.860 I/O targets: 00:37:53.860 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:37:53.860 00:37:53.860 00:37:53.860 CUnit - A unit testing framework for C - Version 2.1-3 00:37:53.860 http://cunit.sourceforge.net/ 00:37:53.860 00:37:53.860 00:37:53.860 Suite: bdevio tests on: COMP_lvs0/lv0 00:37:53.860 Test: blockdev write read block ...passed 00:37:53.860 Test: blockdev write zeroes read block ...passed 00:37:53.860 Test: blockdev write zeroes read no split ...passed 00:37:53.860 Test: blockdev write zeroes read split ...passed 00:37:53.860 Test: blockdev write zeroes read split partial ...passed 00:37:53.860 Test: blockdev reset ...[2024-07-23 17:31:49.269936] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:37:53.860 passed 00:37:53.860 Test: blockdev write read 8 blocks ...passed 00:37:53.860 Test: blockdev write read size > 128k ...passed 00:37:53.860 Test: blockdev write read invalid size ...passed 00:37:53.860 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:37:53.860 Test: blockdev write read offset + nbytes > size 
of blockdev ...passed 00:37:53.860 Test: blockdev write read max offset ...passed 00:37:53.860 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:37:53.860 Test: blockdev writev readv 8 blocks ...passed 00:37:53.860 Test: blockdev writev readv 30 x 1block ...passed 00:37:53.860 Test: blockdev writev readv block ...passed 00:37:53.860 Test: blockdev writev readv size > 128k ...passed 00:37:53.860 Test: blockdev writev readv size > 128k in two iovs ...passed 00:37:54.119 Test: blockdev comparev and writev ...passed 00:37:54.119 Test: blockdev nvme passthru rw ...passed 00:37:54.119 Test: blockdev nvme passthru vendor specific ...passed 00:37:54.119 Test: blockdev nvme admin passthru ...passed 00:37:54.119 Test: blockdev copy ...passed 00:37:54.119 00:37:54.119 Run Summary: Type Total Ran Passed Failed Inactive 00:37:54.119 suites 1 1 n/a 0 0 00:37:54.119 tests 23 23 23 0 0 00:37:54.119 asserts 130 130 130 0 n/a 00:37:54.119 00:37:54.119 Elapsed time = 0.294 seconds 00:37:54.119 0 00:37:54.119 17:31:49 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:37:54.119 17:31:49 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:37:54.119 17:31:49 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:37:54.378 17:31:49 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:37:54.378 17:31:49 compress_isal -- compress/compress.sh@62 -- # killprocess 111880 00:37:54.378 17:31:49 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 111880 ']' 00:37:54.378 17:31:49 compress_isal -- common/autotest_common.sh@952 -- # kill -0 111880 00:37:54.378 17:31:49 compress_isal -- common/autotest_common.sh@953 -- # uname 00:37:54.378 17:31:49 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:54.378 17:31:49 
compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 111880 00:37:54.637 17:31:49 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:54.637 17:31:49 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:54.637 17:31:49 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 111880' 00:37:54.637 killing process with pid 111880 00:37:54.637 17:31:49 compress_isal -- common/autotest_common.sh@967 -- # kill 111880 00:37:54.638 17:31:49 compress_isal -- common/autotest_common.sh@972 -- # wait 111880 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@91 -- # '[' 1 -eq 1 ']' 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@92 -- # run_bdevperf 64 16384 30 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=113009 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 64 -o 16384 -w verify -t 30 -C -m 0x6 00:37:57.926 17:31:52 compress_isal -- compress/compress.sh@73 -- # waitforlisten 113009 00:37:57.926 17:31:52 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 113009 ']' 00:37:57.926 17:31:52 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:57.926 17:31:52 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:57.926 17:31:52 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:57.926 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
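Each app launch in this log passes a hex core mask (`-m 0x6` for the bdevperf runs, `-c 0x7` for bdevio), and the EAL log then reports reactors on the matching cores: 0x6 selects cores 1 and 2, 0x7 selects cores 0, 1, and 2. The real parsing happens inside DPDK's EAL; the sketch below only mirrors the bit arithmetic:

```shell
# Expand a hex core mask into the zero-based CPU indices whose bits are set.
mask_to_cores() {
    local mask=$(( $1 )) i=0 cores=""
    while [ "$mask" -ne 0 ]; do
        if (( mask & 1 )); then
            cores+="${cores:+ }$i"   # append index, space-separated
        fi
        mask=$(( mask >> 1 ))
        i=$(( i + 1 ))
    done
    echo "$cores"
}
```

With `-m 0x6` the app therefore gets "Total cores available: 2" and reactors on cores 1 and 2, matching the log lines above.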
00:37:57.926 17:31:52 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:57.926 17:31:52 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:37:57.926 [2024-07-23 17:31:52.831947] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:37:57.926 [2024-07-23 17:31:52.832019] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid113009 ] 00:37:57.926 [2024-07-23 17:31:52.968933] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:37:57.926 [2024-07-23 17:31:53.026096] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:37:57.926 [2024-07-23 17:31:53.026103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:58.494 17:31:53 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:58.494 17:31:53 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:37:58.494 17:31:53 compress_isal -- compress/compress.sh@74 -- # create_vols 00:37:58.494 17:31:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:37:58.494 17:31:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:37:59.063 17:31:54 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:37:59.063 17:31:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:37:59.063 17:31:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:37:59.063 17:31:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:37:59.063 17:31:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:37:59.063 17:31:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:37:59.063 
17:31:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:37:59.322 17:31:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:37:59.582 [ 00:37:59.582 { 00:37:59.582 "name": "Nvme0n1", 00:37:59.582 "aliases": [ 00:37:59.582 "01000000-0000-0000-5cd2-e43197705251" 00:37:59.582 ], 00:37:59.582 "product_name": "NVMe disk", 00:37:59.582 "block_size": 512, 00:37:59.582 "num_blocks": 15002931888, 00:37:59.582 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:37:59.582 "assigned_rate_limits": { 00:37:59.582 "rw_ios_per_sec": 0, 00:37:59.582 "rw_mbytes_per_sec": 0, 00:37:59.582 "r_mbytes_per_sec": 0, 00:37:59.582 "w_mbytes_per_sec": 0 00:37:59.582 }, 00:37:59.582 "claimed": false, 00:37:59.582 "zoned": false, 00:37:59.582 "supported_io_types": { 00:37:59.582 "read": true, 00:37:59.582 "write": true, 00:37:59.582 "unmap": true, 00:37:59.582 "flush": true, 00:37:59.582 "reset": true, 00:37:59.582 "nvme_admin": true, 00:37:59.582 "nvme_io": true, 00:37:59.582 "nvme_io_md": false, 00:37:59.582 "write_zeroes": true, 00:37:59.582 "zcopy": false, 00:37:59.582 "get_zone_info": false, 00:37:59.582 "zone_management": false, 00:37:59.582 "zone_append": false, 00:37:59.582 "compare": false, 00:37:59.582 "compare_and_write": false, 00:37:59.582 "abort": true, 00:37:59.582 "seek_hole": false, 00:37:59.582 "seek_data": false, 00:37:59.582 "copy": false, 00:37:59.582 "nvme_iov_md": false 00:37:59.582 }, 00:37:59.582 "driver_specific": { 00:37:59.582 "nvme": [ 00:37:59.582 { 00:37:59.582 "pci_address": "0000:5e:00.0", 00:37:59.582 "trid": { 00:37:59.582 "trtype": "PCIe", 00:37:59.582 "traddr": "0000:5e:00.0" 00:37:59.582 }, 00:37:59.582 "ctrlr_data": { 00:37:59.582 "cntlid": 0, 00:37:59.582 "vendor_id": "0x8086", 00:37:59.582 "model_number": "INTEL SSDPF2KX076TZO", 00:37:59.582 "serial_number": 
"PHAC0301002G7P6CGN", 00:37:59.582 "firmware_revision": "JCV10200", 00:37:59.582 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:37:59.582 "oacs": { 00:37:59.582 "security": 1, 00:37:59.582 "format": 1, 00:37:59.582 "firmware": 1, 00:37:59.582 "ns_manage": 1 00:37:59.582 }, 00:37:59.582 "multi_ctrlr": false, 00:37:59.582 "ana_reporting": false 00:37:59.582 }, 00:37:59.582 "vs": { 00:37:59.582 "nvme_version": "1.3" 00:37:59.582 }, 00:37:59.582 "ns_data": { 00:37:59.582 "id": 1, 00:37:59.582 "can_share": false 00:37:59.582 }, 00:37:59.582 "security": { 00:37:59.582 "opal": true 00:37:59.582 } 00:37:59.582 } 00:37:59.582 ], 00:37:59.582 "mp_policy": "active_passive" 00:37:59.582 } 00:37:59.582 } 00:37:59.582 ] 00:37:59.582 17:31:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:37:59.582 17:31:54 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:38:02.117 eed5ba91-24dc-4ecb-b25a-70f7a0ce94f8 00:38:02.117 17:31:57 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:38:02.117 8c68d7a4-593f-4ae6-8086-c85f4b2c7eb2 00:38:02.376 17:31:57 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:38:02.376 17:31:57 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:38:02.376 17:31:57 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:38:02.376 17:31:57 compress_isal -- common/autotest_common.sh@899 -- # local i 00:38:02.376 17:31:57 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:38:02.376 17:31:57 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:38:02.376 17:31:57 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:38:02.376 17:31:57 
compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:38:02.635 [ 00:38:02.635 { 00:38:02.635 "name": "8c68d7a4-593f-4ae6-8086-c85f4b2c7eb2", 00:38:02.635 "aliases": [ 00:38:02.635 "lvs0/lv0" 00:38:02.635 ], 00:38:02.635 "product_name": "Logical Volume", 00:38:02.635 "block_size": 512, 00:38:02.635 "num_blocks": 204800, 00:38:02.635 "uuid": "8c68d7a4-593f-4ae6-8086-c85f4b2c7eb2", 00:38:02.635 "assigned_rate_limits": { 00:38:02.635 "rw_ios_per_sec": 0, 00:38:02.635 "rw_mbytes_per_sec": 0, 00:38:02.635 "r_mbytes_per_sec": 0, 00:38:02.635 "w_mbytes_per_sec": 0 00:38:02.635 }, 00:38:02.635 "claimed": false, 00:38:02.635 "zoned": false, 00:38:02.635 "supported_io_types": { 00:38:02.635 "read": true, 00:38:02.635 "write": true, 00:38:02.635 "unmap": true, 00:38:02.635 "flush": false, 00:38:02.635 "reset": true, 00:38:02.635 "nvme_admin": false, 00:38:02.635 "nvme_io": false, 00:38:02.635 "nvme_io_md": false, 00:38:02.635 "write_zeroes": true, 00:38:02.635 "zcopy": false, 00:38:02.635 "get_zone_info": false, 00:38:02.635 "zone_management": false, 00:38:02.635 "zone_append": false, 00:38:02.635 "compare": false, 00:38:02.635 "compare_and_write": false, 00:38:02.635 "abort": false, 00:38:02.635 "seek_hole": true, 00:38:02.635 "seek_data": true, 00:38:02.635 "copy": false, 00:38:02.635 "nvme_iov_md": false 00:38:02.635 }, 00:38:02.635 "driver_specific": { 00:38:02.635 "lvol": { 00:38:02.635 "lvol_store_uuid": "eed5ba91-24dc-4ecb-b25a-70f7a0ce94f8", 00:38:02.635 "base_bdev": "Nvme0n1", 00:38:02.635 "thin_provision": true, 00:38:02.635 "num_allocated_clusters": 0, 00:38:02.635 "snapshot": false, 00:38:02.635 "clone": false, 00:38:02.635 "esnap_clone": false 00:38:02.636 } 00:38:02.636 } 00:38:02.636 } 00:38:02.636 ] 00:38:02.636 17:31:58 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:38:02.636 17:31:58 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 
00:38:02.636 17:31:58 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:38:02.894 [2024-07-23 17:31:58.230509] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:38:02.894 COMP_lvs0/lv0 00:38:02.894 17:31:58 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:38:02.894 17:31:58 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:38:02.894 17:31:58 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:38:02.894 17:31:58 compress_isal -- common/autotest_common.sh@899 -- # local i 00:38:02.894 17:31:58 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:38:02.894 17:31:58 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:38:02.894 17:31:58 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:38:03.154 17:31:58 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:38:03.422 [ 00:38:03.422 { 00:38:03.422 "name": "COMP_lvs0/lv0", 00:38:03.422 "aliases": [ 00:38:03.422 "4dc82703-792e-5a83-869b-f5db6276edea" 00:38:03.422 ], 00:38:03.422 "product_name": "compress", 00:38:03.422 "block_size": 512, 00:38:03.422 "num_blocks": 200704, 00:38:03.422 "uuid": "4dc82703-792e-5a83-869b-f5db6276edea", 00:38:03.422 "assigned_rate_limits": { 00:38:03.422 "rw_ios_per_sec": 0, 00:38:03.422 "rw_mbytes_per_sec": 0, 00:38:03.422 "r_mbytes_per_sec": 0, 00:38:03.422 "w_mbytes_per_sec": 0 00:38:03.422 }, 00:38:03.422 "claimed": false, 00:38:03.422 "zoned": false, 00:38:03.422 "supported_io_types": { 00:38:03.422 "read": true, 00:38:03.422 "write": true, 00:38:03.422 "unmap": false, 00:38:03.422 "flush": false, 00:38:03.422 
"reset": false, 00:38:03.422 "nvme_admin": false, 00:38:03.422 "nvme_io": false, 00:38:03.422 "nvme_io_md": false, 00:38:03.422 "write_zeroes": true, 00:38:03.422 "zcopy": false, 00:38:03.422 "get_zone_info": false, 00:38:03.422 "zone_management": false, 00:38:03.422 "zone_append": false, 00:38:03.422 "compare": false, 00:38:03.422 "compare_and_write": false, 00:38:03.422 "abort": false, 00:38:03.422 "seek_hole": false, 00:38:03.422 "seek_data": false, 00:38:03.422 "copy": false, 00:38:03.422 "nvme_iov_md": false 00:38:03.422 }, 00:38:03.422 "driver_specific": { 00:38:03.422 "compress": { 00:38:03.422 "name": "COMP_lvs0/lv0", 00:38:03.422 "base_bdev_name": "8c68d7a4-593f-4ae6-8086-c85f4b2c7eb2", 00:38:03.422 "pm_path": "/tmp/pmem/706714c8-b03d-4777-bdc7-832c6bd550ef" 00:38:03.422 } 00:38:03.422 } 00:38:03.422 } 00:38:03.422 ] 00:38:03.422 17:31:58 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:38:03.422 17:31:58 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:38:03.697 Running I/O for 30 seconds... 
00:38:35.784 00:38:35.784 Latency(us) 00:38:35.784 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:35.784 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 64, IO size: 16384) 00:38:35.784 Verification LBA range: start 0x0 length 0xc40 00:38:35.784 COMP_lvs0/lv0 : 30.02 441.40 6.90 0.00 0.00 144673.04 751.53 138594.39 00:38:35.784 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 64, IO size: 16384) 00:38:35.784 Verification LBA range: start 0xc40 length 0xc40 00:38:35.784 COMP_lvs0/lv0 : 30.02 1765.16 27.58 0.00 0.00 35984.18 2037.31 85709.69 00:38:35.784 =================================================================================================================== 00:38:35.784 Total : 2206.56 34.48 0.00 0.00 57728.52 751.53 138594.39 00:38:35.784 0 00:38:35.784 17:32:29 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:38:35.784 17:32:29 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:38:35.784 17:32:29 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:38:35.784 17:32:29 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:38:35.784 17:32:29 compress_isal -- compress/compress.sh@78 -- # killprocess 113009 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 113009 ']' 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@952 -- # kill -0 113009 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@953 -- # uname 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 113009 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:38:35.784 17:32:29 
compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 113009' 00:38:35.784 killing process with pid 113009 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@967 -- # kill 113009 00:38:35.784 Received shutdown signal, test time was about 30.000000 seconds 00:38:35.784 00:38:35.784 Latency(us) 00:38:35.784 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:35.784 =================================================================================================================== 00:38:35.784 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:35.784 17:32:29 compress_isal -- common/autotest_common.sh@972 -- # wait 113009 00:38:37.164 17:32:32 compress_isal -- compress/compress.sh@95 -- # export TEST_TRANSPORT=tcp 00:38:37.164 17:32:32 compress_isal -- compress/compress.sh@95 -- # TEST_TRANSPORT=tcp 00:38:37.164 17:32:32 compress_isal -- compress/compress.sh@96 -- # NET_TYPE=virt 00:38:37.164 17:32:32 compress_isal -- compress/compress.sh@96 -- # nvmftestinit 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@448 -- # prepare_net_devs 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@410 -- # local -g is_hw=no 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@412 -- # remove_spdk_ns 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:38:37.164 17:32:32 compress_isal -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:38:37.164 17:32:32 compress_isal -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@414 -- # [[ virt != virt ]] 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@416 -- # 
[[ no == yes ]] 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@423 -- # [[ virt == phy ]] 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@426 -- # [[ virt == phy-fallback ]] 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@432 -- # nvmf_veth_init 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:38:37.164 Cannot find device "nvmf_tgt_br" 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@155 -- # true 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:38:37.164 Cannot find device "nvmf_tgt_br2" 00:38:37.164 17:32:32 
compress_isal -- nvmf/common.sh@156 -- # true 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:38:37.164 Cannot find device "nvmf_tgt_br" 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@158 -- # true 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:38:37.164 Cannot find device "nvmf_tgt_br2" 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@159 -- # true 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:38:37.164 17:32:32 compress_isal -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:38:37.424 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@162 -- # true 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:38:37.424 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@163 -- # true 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:38:37.424 17:32:32 
compress_isal -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:38:37.424 17:32:32 compress_isal -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:38:37.683 17:32:32 compress_isal -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:38:37.683 17:32:32 compress_isal -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:38:37.683 17:32:32 compress_isal -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:38:37.683 17:32:32 compress_isal -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:38:37.683 17:32:32 compress_isal -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:38:37.683 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:38:37.683 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.103 ms 00:38:37.683 00:38:37.683 --- 10.0.0.2 ping statistics --- 00:38:37.683 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:37.683 rtt min/avg/max/mdev = 0.103/0.103/0.103/0.000 ms 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:38:37.684 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:38:37.684 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.070 ms 00:38:37.684 00:38:37.684 --- 10.0.0.3 ping statistics --- 00:38:37.684 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:37.684 rtt min/avg/max/mdev = 0.070/0.070/0.070/0.000 ms 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:38:37.684 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:38:37.684 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.057 ms 00:38:37.684 00:38:37.684 --- 10.0.0.1 ping statistics --- 00:38:37.684 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:37.684 rtt min/avg/max/mdev = 0.057/0.057/0.057/0.000 ms 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@433 -- # return 0 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:38:37.684 17:32:32 compress_isal -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:38:37.684 17:32:33 compress_isal -- compress/compress.sh@97 -- # nvmfappstart -m 0x7 00:38:37.684 
17:32:33 compress_isal -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@722 -- # xtrace_disable 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:38:37.684 17:32:33 compress_isal -- nvmf/common.sh@481 -- # nvmfpid=118332 00:38:37.684 17:32:33 compress_isal -- nvmf/common.sh@482 -- # waitforlisten 118332 00:38:37.684 17:32:33 compress_isal -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 118332 ']' 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:37.684 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:37.684 17:32:33 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:38:37.943 [2024-07-23 17:32:33.117437] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:38:37.943 [2024-07-23 17:32:33.117507] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:38:37.943 [2024-07-23 17:32:33.258259] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:38:37.943 [2024-07-23 17:32:33.309081] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 
00:38:37.943 [2024-07-23 17:32:33.309132] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:38:37.943 [2024-07-23 17:32:33.309147] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:38:37.943 [2024-07-23 17:32:33.309160] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:38:37.943 [2024-07-23 17:32:33.309171] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:38:37.943 [2024-07-23 17:32:33.309242] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:37.943 [2024-07-23 17:32:33.309741] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:38:37.943 [2024-07-23 17:32:33.309743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:38.876 17:32:34 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:38.876 17:32:34 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:38:38.876 17:32:34 compress_isal -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:38:38.876 17:32:34 compress_isal -- common/autotest_common.sh@728 -- # xtrace_disable 00:38:38.876 17:32:34 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:38:38.876 17:32:34 compress_isal -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:38:38.876 17:32:34 compress_isal -- compress/compress.sh@98 -- # trap 'nvmftestfini; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:38:38.876 17:32:34 compress_isal -- compress/compress.sh@101 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t tcp -u 8192 00:38:39.135 [2024-07-23 17:32:34.330704] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:38:39.135 17:32:34 compress_isal -- compress/compress.sh@102 -- # create_vols 00:38:39.135 17:32:34 compress_isal -- compress/compress.sh@34 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:38:39.135 17:32:34 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:38:39.702 17:32:34 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:38:39.702 17:32:34 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:38:39.702 17:32:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:38:39.702 17:32:34 compress_isal -- common/autotest_common.sh@899 -- # local i 00:38:39.702 17:32:34 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:38:39.702 17:32:34 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:38:39.702 17:32:34 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:38:39.961 17:32:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:38:39.961 [ 00:38:39.961 { 00:38:39.961 "name": "Nvme0n1", 00:38:39.961 "aliases": [ 00:38:39.961 "01000000-0000-0000-5cd2-e43197705251" 00:38:39.961 ], 00:38:39.961 "product_name": "NVMe disk", 00:38:39.961 "block_size": 512, 00:38:39.961 "num_blocks": 15002931888, 00:38:39.961 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:38:39.961 "assigned_rate_limits": { 00:38:39.961 "rw_ios_per_sec": 0, 00:38:39.961 "rw_mbytes_per_sec": 0, 00:38:39.961 "r_mbytes_per_sec": 0, 00:38:39.961 "w_mbytes_per_sec": 0 00:38:39.961 }, 00:38:39.961 "claimed": false, 00:38:39.961 "zoned": false, 00:38:39.961 "supported_io_types": { 00:38:39.961 "read": true, 00:38:39.961 "write": true, 00:38:39.961 "unmap": true, 00:38:39.961 "flush": true, 00:38:39.961 "reset": true, 00:38:39.961 "nvme_admin": true, 00:38:39.961 "nvme_io": true, 00:38:39.961 "nvme_io_md": false, 00:38:39.961 "write_zeroes": true, 
00:38:39.961 "zcopy": false, 00:38:39.961 "get_zone_info": false, 00:38:39.961 "zone_management": false, 00:38:39.961 "zone_append": false, 00:38:39.961 "compare": false, 00:38:39.961 "compare_and_write": false, 00:38:39.961 "abort": true, 00:38:39.961 "seek_hole": false, 00:38:39.961 "seek_data": false, 00:38:39.961 "copy": false, 00:38:39.961 "nvme_iov_md": false 00:38:39.961 }, 00:38:39.961 "driver_specific": { 00:38:39.961 "nvme": [ 00:38:39.961 { 00:38:39.961 "pci_address": "0000:5e:00.0", 00:38:39.961 "trid": { 00:38:39.961 "trtype": "PCIe", 00:38:39.961 "traddr": "0000:5e:00.0" 00:38:39.961 }, 00:38:39.961 "ctrlr_data": { 00:38:39.961 "cntlid": 0, 00:38:39.961 "vendor_id": "0x8086", 00:38:39.961 "model_number": "INTEL SSDPF2KX076TZO", 00:38:39.961 "serial_number": "PHAC0301002G7P6CGN", 00:38:39.961 "firmware_revision": "JCV10200", 00:38:39.961 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:38:39.961 "oacs": { 00:38:39.961 "security": 1, 00:38:39.961 "format": 1, 00:38:39.961 "firmware": 1, 00:38:39.961 "ns_manage": 1 00:38:39.961 }, 00:38:39.961 "multi_ctrlr": false, 00:38:39.961 "ana_reporting": false 00:38:39.961 }, 00:38:39.961 "vs": { 00:38:39.961 "nvme_version": "1.3" 00:38:39.961 }, 00:38:39.961 "ns_data": { 00:38:39.961 "id": 1, 00:38:39.961 "can_share": false 00:38:39.961 }, 00:38:39.961 "security": { 00:38:39.961 "opal": true 00:38:39.961 } 00:38:39.961 } 00:38:39.961 ], 00:38:39.961 "mp_policy": "active_passive" 00:38:39.961 } 00:38:39.961 } 00:38:39.961 ] 00:38:39.961 17:32:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:38:39.961 17:32:35 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:38:42.495 2880fd3a-da4b-4b22-9660-dec02eac9498 00:38:42.495 17:32:37 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 
00:38:42.754 3afd4984-7a9e-4466-bf1f-22d440329eac 00:38:42.754 17:32:37 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:38:42.754 17:32:37 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:38:42.754 17:32:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:38:42.754 17:32:37 compress_isal -- common/autotest_common.sh@899 -- # local i 00:38:42.754 17:32:37 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:38:42.754 17:32:37 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:38:42.754 17:32:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:38:43.012 17:32:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:38:43.271 [ 00:38:43.271 { 00:38:43.271 "name": "3afd4984-7a9e-4466-bf1f-22d440329eac", 00:38:43.271 "aliases": [ 00:38:43.271 "lvs0/lv0" 00:38:43.271 ], 00:38:43.271 "product_name": "Logical Volume", 00:38:43.271 "block_size": 512, 00:38:43.271 "num_blocks": 204800, 00:38:43.271 "uuid": "3afd4984-7a9e-4466-bf1f-22d440329eac", 00:38:43.271 "assigned_rate_limits": { 00:38:43.271 "rw_ios_per_sec": 0, 00:38:43.271 "rw_mbytes_per_sec": 0, 00:38:43.271 "r_mbytes_per_sec": 0, 00:38:43.271 "w_mbytes_per_sec": 0 00:38:43.271 }, 00:38:43.271 "claimed": false, 00:38:43.271 "zoned": false, 00:38:43.271 "supported_io_types": { 00:38:43.271 "read": true, 00:38:43.271 "write": true, 00:38:43.271 "unmap": true, 00:38:43.271 "flush": false, 00:38:43.271 "reset": true, 00:38:43.271 "nvme_admin": false, 00:38:43.271 "nvme_io": false, 00:38:43.271 "nvme_io_md": false, 00:38:43.271 "write_zeroes": true, 00:38:43.271 "zcopy": false, 00:38:43.271 "get_zone_info": false, 00:38:43.271 "zone_management": false, 00:38:43.271 "zone_append": false, 00:38:43.271 "compare": false, 
00:38:43.271 "compare_and_write": false, 00:38:43.271 "abort": false, 00:38:43.271 "seek_hole": true, 00:38:43.271 "seek_data": true, 00:38:43.271 "copy": false, 00:38:43.271 "nvme_iov_md": false 00:38:43.271 }, 00:38:43.271 "driver_specific": { 00:38:43.271 "lvol": { 00:38:43.271 "lvol_store_uuid": "2880fd3a-da4b-4b22-9660-dec02eac9498", 00:38:43.271 "base_bdev": "Nvme0n1", 00:38:43.271 "thin_provision": true, 00:38:43.271 "num_allocated_clusters": 0, 00:38:43.271 "snapshot": false, 00:38:43.271 "clone": false, 00:38:43.271 "esnap_clone": false 00:38:43.271 } 00:38:43.271 } 00:38:43.271 } 00:38:43.271 ] 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:38:43.271 17:32:38 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:38:43.271 17:32:38 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:38:43.271 [2024-07-23 17:32:38.654451] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:38:43.271 COMP_lvs0/lv0 00:38:43.271 17:32:38 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@899 -- # local i 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:38:43.271 17:32:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:38:43.530 17:32:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
COMP_lvs0/lv0 -t 2000 00:38:43.788 [ 00:38:43.788 { 00:38:43.788 "name": "COMP_lvs0/lv0", 00:38:43.788 "aliases": [ 00:38:43.788 "58dceb39-b7da-51c4-abbf-55fd31bc7375" 00:38:43.788 ], 00:38:43.788 "product_name": "compress", 00:38:43.788 "block_size": 512, 00:38:43.788 "num_blocks": 200704, 00:38:43.788 "uuid": "58dceb39-b7da-51c4-abbf-55fd31bc7375", 00:38:43.788 "assigned_rate_limits": { 00:38:43.788 "rw_ios_per_sec": 0, 00:38:43.788 "rw_mbytes_per_sec": 0, 00:38:43.788 "r_mbytes_per_sec": 0, 00:38:43.788 "w_mbytes_per_sec": 0 00:38:43.788 }, 00:38:43.788 "claimed": false, 00:38:43.788 "zoned": false, 00:38:43.788 "supported_io_types": { 00:38:43.788 "read": true, 00:38:43.788 "write": true, 00:38:43.788 "unmap": false, 00:38:43.788 "flush": false, 00:38:43.788 "reset": false, 00:38:43.788 "nvme_admin": false, 00:38:43.788 "nvme_io": false, 00:38:43.788 "nvme_io_md": false, 00:38:43.788 "write_zeroes": true, 00:38:43.788 "zcopy": false, 00:38:43.788 "get_zone_info": false, 00:38:43.788 "zone_management": false, 00:38:43.788 "zone_append": false, 00:38:43.788 "compare": false, 00:38:43.788 "compare_and_write": false, 00:38:43.788 "abort": false, 00:38:43.788 "seek_hole": false, 00:38:43.788 "seek_data": false, 00:38:43.788 "copy": false, 00:38:43.788 "nvme_iov_md": false 00:38:43.788 }, 00:38:43.788 "driver_specific": { 00:38:43.788 "compress": { 00:38:43.788 "name": "COMP_lvs0/lv0", 00:38:43.788 "base_bdev_name": "3afd4984-7a9e-4466-bf1f-22d440329eac", 00:38:43.788 "pm_path": "/tmp/pmem/a4f98371-2096-4ef2-a911-30f8fb63588c" 00:38:43.788 } 00:38:43.788 } 00:38:43.788 } 00:38:43.788 ] 00:38:43.788 17:32:39 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:38:43.788 17:32:39 compress_isal -- compress/compress.sh@103 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:38:44.046 17:32:39 compress_isal -- compress/compress.sh@104 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 COMP_lvs0/lv0 00:38:44.304 17:32:39 compress_isal -- compress/compress.sh@105 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420 00:38:44.563 [2024-07-23 17:32:39.739390] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:38:44.563 17:32:39 compress_isal -- compress/compress.sh@109 -- # perf_pid=119241 00:38:44.563 17:32:39 compress_isal -- compress/compress.sh@108 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:tcp adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 64 -s 512 -w randrw -t 30 -c 0x18 -M 50 00:38:44.563 17:32:39 compress_isal -- compress/compress.sh@112 -- # trap 'killprocess $perf_pid; compress_err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:38:44.563 17:32:39 compress_isal -- compress/compress.sh@113 -- # wait 119241 00:38:44.822 [2024-07-23 17:32:40.049597] subsystem.c:1572:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on TCP/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:39:16.928 Initializing NVMe Controllers 00:39:16.928 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:39:16.928 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:39:16.928 Associating TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:39:16.928 Initialization complete. Launching workers. 
00:39:16.928 ======================================================== 00:39:16.928 Latency(us) 00:39:16.928 Device Information : IOPS MiB/s Average min max 00:39:16.928 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 3716.57 14.52 17222.59 2226.56 36748.37 00:39:16.928 TCP (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 2311.43 9.03 27695.71 2301.53 48993.28 00:39:16.928 ======================================================== 00:39:16.928 Total : 6028.00 23.55 21238.50 2226.56 48993.28 00:39:16.928 00:39:16.928 17:33:10 compress_isal -- compress/compress.sh@114 -- # destroy_vols 00:39:16.928 17:33:10 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:39:16.928 17:33:10 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:39:16.928 17:33:10 compress_isal -- compress/compress.sh@116 -- # trap - SIGINT SIGTERM EXIT 00:39:16.928 17:33:10 compress_isal -- compress/compress.sh@117 -- # nvmftestfini 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@488 -- # nvmfcleanup 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@117 -- # sync 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@120 -- # set +e 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@121 -- # for i in {1..20} 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:39:16.928 rmmod nvme_tcp 00:39:16.928 rmmod nvme_fabrics 00:39:16.928 rmmod nvme_keyring 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@124 -- # set -e 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@125 -- # return 0 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@489 -- # '[' -n 
118332 ']' 00:39:16.928 17:33:10 compress_isal -- nvmf/common.sh@490 -- # killprocess 118332 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 118332 ']' 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@952 -- # kill -0 118332 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@953 -- # uname 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 118332 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 118332' 00:39:16.928 killing process with pid 118332 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@967 -- # kill 118332 00:39:16.928 17:33:10 compress_isal -- common/autotest_common.sh@972 -- # wait 118332 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@278 -- # remove_spdk_ns 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:39:18.832 17:33:13 compress_isal -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:39:18.832 17:33:13 compress_isal -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:39:18.832 17:33:13 compress_isal -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:39:18.832 17:33:13 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 
00:39:18.832 00:39:18.832 real 2m9.105s 00:39:18.832 user 6m1.125s 00:39:18.832 sys 0m20.944s 00:39:18.832 17:33:13 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:39:18.832 17:33:13 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:39:18.832 ************************************ 00:39:18.832 END TEST compress_isal 00:39:18.832 ************************************ 00:39:18.832 17:33:13 -- common/autotest_common.sh@1142 -- # return 0 00:39:18.832 17:33:13 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:39:18.832 17:33:13 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:39:18.832 17:33:13 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:39:18.832 17:33:13 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:39:18.832 17:33:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:18.832 17:33:13 -- common/autotest_common.sh@10 -- # set +x 00:39:18.832 ************************************ 00:39:18.832 START TEST blockdev_crypto_aesni 00:39:18.832 ************************************ 00:39:18.832 17:33:13 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:39:18.832 * Looking for test storage... 
00:39:18.832 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:39:18.832 17:33:14 
blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=123992 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 123992 00:39:18.832 17:33:14 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:39:18.832 17:33:14 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 123992 ']' 00:39:18.832 17:33:14 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:39:18.832 17:33:14 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:39:18.832 17:33:14 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:39:18.832 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:39:18.832 17:33:14 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:39:18.832 17:33:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:18.832 [2024-07-23 17:33:14.155305] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:39:18.832 [2024-07-23 17:33:14.155378] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid123992 ] 00:39:19.092 [2024-07-23 17:33:14.291489] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:19.092 [2024-07-23 17:33:14.346582] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:19.660 17:33:15 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:39:19.660 17:33:15 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:39:19.920 17:33:15 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:39:19.920 17:33:15 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:39:19.920 17:33:15 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:39:19.920 17:33:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:39:19.920 17:33:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:19.920 [2024-07-23 17:33:15.096948] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:39:19.920 [2024-07-23 17:33:15.104976] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:39:19.920 [2024-07-23 17:33:15.112990] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:39:19.920 [2024-07-23 17:33:15.179239] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:39:22.457 true 00:39:22.457 true 00:39:22.457 true 00:39:22.457 true 00:39:22.457 Malloc0 00:39:22.457 Malloc1 00:39:22.457 Malloc2 00:39:22.457 Malloc3 00:39:22.457 [2024-07-23 17:33:17.755871] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:39:22.457 crypto_ram 00:39:22.457 [2024-07-23 17:33:17.763887] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:39:22.457 crypto_ram2 00:39:22.457 [2024-07-23 17:33:17.771914] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:39:22.457 crypto_ram3 00:39:22.457 [2024-07-23 17:33:17.779936] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:39:22.457 crypto_ram4 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:39:22.457 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:39:22.457 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:39:22.457 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:39:22.457 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:39:22.457 17:33:17 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:39:22.457 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:39:22.457 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:22.716 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:39:22.716 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:39:22.716 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:39:22.716 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:39:22.716 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:39:22.716 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:22.716 17:33:17 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:39:22.716 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:39:22.716 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ae2d32c2-366c-597e-8fb7-c7a860bb7ad2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae2d32c2-366c-597e-8fb7-c7a860bb7ad2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5a63c159-c261-5b1e-b6c2-886203ec6d2e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5a63c159-c261-5b1e-b6c2-886203ec6d2e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "52e55600-9473-578a-957f-224a4a20aca4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "52e55600-9473-578a-957f-224a4a20aca4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "cd4b2ad2-4f1d-5f04-aa3d-ad4fb0592169"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cd4b2ad2-4f1d-5f04-aa3d-ad4fb0592169",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:39:22.716 17:33:17 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:39:22.716 17:33:18 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # 
bdev_list=("${bdevs_name[@]}") 00:39:22.716 17:33:18 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:39:22.716 17:33:18 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:39:22.716 17:33:18 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 123992 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 123992 ']' 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 123992 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 123992 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 123992' 00:39:22.716 killing process with pid 123992 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 123992 00:39:22.716 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 123992 00:39:23.283 17:33:18 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:39:23.283 17:33:18 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:39:23.283 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:39:23.283 17:33:18 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:23.283 17:33:18 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:23.283 ************************************ 00:39:23.283 START TEST bdev_hello_world 00:39:23.283 ************************************ 00:39:23.283 17:33:18 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:39:23.283 [2024-07-23 17:33:18.637321] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:39:23.283 [2024-07-23 17:33:18.637382] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid124560 ] 00:39:23.542 [2024-07-23 17:33:18.771665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:23.542 [2024-07-23 17:33:18.826452] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:23.542 [2024-07-23 17:33:18.847789] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:39:23.542 [2024-07-23 17:33:18.855815] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:39:23.542 [2024-07-23 17:33:18.863835] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:39:23.800 [2024-07-23 17:33:18.968973] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:39:26.330 [2024-07-23 17:33:21.367301] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:39:26.330 [2024-07-23 17:33:21.367378] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:39:26.330 [2024-07-23 17:33:21.367393] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:26.330 [2024-07-23 17:33:21.375317] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:39:26.330 [2024-07-23 17:33:21.375338] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:39:26.330 [2024-07-23 17:33:21.375350] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:26.330 [2024-07-23 17:33:21.383336] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:39:26.330 [2024-07-23 17:33:21.383355] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:39:26.330 [2024-07-23 17:33:21.383367] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:26.330 [2024-07-23 17:33:21.391357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:39:26.330 [2024-07-23 17:33:21.391376] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:39:26.330 [2024-07-23 17:33:21.391387] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:26.330 [2024-07-23 17:33:21.469373] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:39:26.330 [2024-07-23 17:33:21.469420] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:39:26.330 [2024-07-23 17:33:21.469439] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:39:26.330 [2024-07-23 17:33:21.470717] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:39:26.330 [2024-07-23 17:33:21.470805] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:39:26.330 [2024-07-23 17:33:21.470823] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:39:26.330 [2024-07-23 17:33:21.470866] hello_bdev.c: 
65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:39:26.330 00:39:26.330 [2024-07-23 17:33:21.470885] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:39:26.588 00:39:26.588 real 0m3.288s 00:39:26.588 user 0m2.688s 00:39:26.588 sys 0m0.565s 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:39:26.588 ************************************ 00:39:26.588 END TEST bdev_hello_world 00:39:26.588 ************************************ 00:39:26.588 17:33:21 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:39:26.588 17:33:21 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:39:26.588 17:33:21 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:39:26.588 17:33:21 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:26.588 17:33:21 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:26.588 ************************************ 00:39:26.588 START TEST bdev_bounds 00:39:26.588 ************************************ 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=125064 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 125064' 00:39:26.588 Process 
bdevio pid: 125064 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 125064 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 125064 ']' 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:39:26.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:39:26.588 17:33:21 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:39:26.847 [2024-07-23 17:33:22.067475] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:39:26.847 [2024-07-23 17:33:22.067612] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid125064 ] 00:39:27.105 [2024-07-23 17:33:22.269669] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:39:27.105 [2024-07-23 17:33:22.321822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:39:27.105 [2024-07-23 17:33:22.321936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:39:27.105 [2024-07-23 17:33:22.321938] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:27.105 [2024-07-23 17:33:22.343443] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:39:27.106 [2024-07-23 17:33:22.351457] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:39:27.106 [2024-07-23 17:33:22.359474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:39:27.106 [2024-07-23 17:33:22.459225] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:39:29.637 [2024-07-23 17:33:24.833062] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:39:29.637 [2024-07-23 17:33:24.833142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:39:29.637 [2024-07-23 17:33:24.833158] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:29.637 [2024-07-23 17:33:24.841096] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:39:29.637 [2024-07-23 17:33:24.841115] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:39:29.637 [2024-07-23 
17:33:24.841127] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:29.637 [2024-07-23 17:33:24.849105] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:39:29.637 [2024-07-23 17:33:24.849122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:39:29.637 [2024-07-23 17:33:24.849134] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:29.637 [2024-07-23 17:33:24.857125] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:39:29.637 [2024-07-23 17:33:24.857142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:39:29.637 [2024-07-23 17:33:24.857154] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:29.637 17:33:24 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:39:29.637 17:33:24 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:39:29.637 17:33:24 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:39:29.896 I/O targets: 00:39:29.896 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:39:29.896 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:39:29.896 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:39:29.896 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:39:29.896 00:39:29.896 00:39:29.896 CUnit - A unit testing framework for C - Version 2.1-3 00:39:29.896 http://cunit.sourceforge.net/ 00:39:29.896 00:39:29.896 00:39:29.896 Suite: bdevio tests on: crypto_ram4 00:39:29.896 Test: blockdev write read block ...passed 00:39:29.896 Test: blockdev write zeroes read block ...passed 00:39:29.896 Test: blockdev write zeroes read no split ...passed 00:39:29.896 Test: blockdev 
write zeroes read split ...passed 00:39:29.896 Test: blockdev write zeroes read split partial ...passed 00:39:29.896 Test: blockdev reset ...passed 00:39:29.896 Test: blockdev write read 8 blocks ...passed 00:39:29.896 Test: blockdev write read size > 128k ...passed 00:39:29.896 Test: blockdev write read invalid size ...passed 00:39:29.896 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:39:29.896 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:39:29.896 Test: blockdev write read max offset ...passed 00:39:29.896 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:39:29.896 Test: blockdev writev readv 8 blocks ...passed 00:39:29.896 Test: blockdev writev readv 30 x 1block ...passed 00:39:29.896 Test: blockdev writev readv block ...passed 00:39:29.896 Test: blockdev writev readv size > 128k ...passed 00:39:29.896 Test: blockdev writev readv size > 128k in two iovs ...passed 00:39:29.896 Test: blockdev comparev and writev ...passed 00:39:29.896 Test: blockdev nvme passthru rw ...passed 00:39:29.896 Test: blockdev nvme passthru vendor specific ...passed 00:39:29.896 Test: blockdev nvme admin passthru ...passed 00:39:29.896 Test: blockdev copy ...passed 00:39:29.896 Suite: bdevio tests on: crypto_ram3 00:39:29.896 Test: blockdev write read block ...passed 00:39:29.896 Test: blockdev write zeroes read block ...passed 00:39:29.896 Test: blockdev write zeroes read no split ...passed 00:39:29.896 Test: blockdev write zeroes read split ...passed 00:39:29.896 Test: blockdev write zeroes read split partial ...passed 00:39:29.896 Test: blockdev reset ...passed 00:39:29.896 Test: blockdev write read 8 blocks ...passed 00:39:29.896 Test: blockdev write read size > 128k ...passed 00:39:29.896 Test: blockdev write read invalid size ...passed 00:39:29.896 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:39:29.896 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:39:29.896 Test: blockdev write read max offset ...passed 00:39:29.896 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:39:29.896 Test: blockdev writev readv 8 blocks ...passed 00:39:29.896 Test: blockdev writev readv 30 x 1block ...passed 00:39:29.896 Test: blockdev writev readv block ...passed 00:39:29.896 Test: blockdev writev readv size > 128k ...passed 00:39:29.896 Test: blockdev writev readv size > 128k in two iovs ...passed 00:39:29.896 Test: blockdev comparev and writev ...passed 00:39:29.896 Test: blockdev nvme passthru rw ...passed 00:39:29.896 Test: blockdev nvme passthru vendor specific ...passed 00:39:29.896 Test: blockdev nvme admin passthru ...passed 00:39:29.896 Test: blockdev copy ...passed 00:39:29.896 Suite: bdevio tests on: crypto_ram2 00:39:29.896 Test: blockdev write read block ...passed 00:39:29.896 Test: blockdev write zeroes read block ...passed 00:39:29.896 Test: blockdev write zeroes read no split ...passed 00:39:30.155 Test: blockdev write zeroes read split ...passed 00:39:30.155 Test: blockdev write zeroes read split partial ...passed 00:39:30.155 Test: blockdev reset ...passed 00:39:30.155 Test: blockdev write read 8 blocks ...passed 00:39:30.155 Test: blockdev write read size > 128k ...passed 00:39:30.155 Test: blockdev write read invalid size ...passed 00:39:30.155 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:39:30.155 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:39:30.155 Test: blockdev write read max offset ...passed 00:39:30.155 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:39:30.155 Test: blockdev writev readv 8 blocks ...passed 00:39:30.155 Test: blockdev writev readv 30 x 1block ...passed 00:39:30.155 Test: blockdev writev readv block ...passed 00:39:30.155 Test: blockdev writev readv size > 128k ...passed 00:39:30.155 Test: blockdev writev readv size > 128k in two iovs ...passed 00:39:30.155 Test: 
blockdev comparev and writev ...passed 00:39:30.155 Test: blockdev nvme passthru rw ...passed 00:39:30.155 Test: blockdev nvme passthru vendor specific ...passed 00:39:30.155 Test: blockdev nvme admin passthru ...passed 00:39:30.155 Test: blockdev copy ...passed 00:39:30.155 Suite: bdevio tests on: crypto_ram 00:39:30.155 Test: blockdev write read block ...passed 00:39:30.155 Test: blockdev write zeroes read block ...passed 00:39:30.413 Test: blockdev write zeroes read no split ...passed 00:39:30.413 Test: blockdev write zeroes read split ...passed 00:39:30.672 Test: blockdev write zeroes read split partial ...passed 00:39:30.672 Test: blockdev reset ...passed 00:39:30.672 Test: blockdev write read 8 blocks ...passed 00:39:30.672 Test: blockdev write read size > 128k ...passed 00:39:30.672 Test: blockdev write read invalid size ...passed 00:39:30.672 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:39:30.672 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:39:30.672 Test: blockdev write read max offset ...passed 00:39:30.672 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:39:30.672 Test: blockdev writev readv 8 blocks ...passed 00:39:30.672 Test: blockdev writev readv 30 x 1block ...passed 00:39:30.672 Test: blockdev writev readv block ...passed 00:39:30.672 Test: blockdev writev readv size > 128k ...passed 00:39:30.672 Test: blockdev writev readv size > 128k in two iovs ...passed 00:39:30.672 Test: blockdev comparev and writev ...passed 00:39:30.672 Test: blockdev nvme passthru rw ...passed 00:39:30.672 Test: blockdev nvme passthru vendor specific ...passed 00:39:30.672 Test: blockdev nvme admin passthru ...passed 00:39:30.672 Test: blockdev copy ...passed 00:39:30.672 00:39:30.672 Run Summary: Type Total Ran Passed Failed Inactive 00:39:30.672 suites 4 4 n/a 0 0 00:39:30.672 tests 92 92 92 0 0 00:39:30.672 asserts 520 520 520 0 n/a 00:39:30.672 00:39:30.672 Elapsed time = 1.587 
seconds 00:39:30.672 0 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 125064 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 125064 ']' 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 125064 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 125064 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:39:30.672 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:39:30.673 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 125064' 00:39:30.673 killing process with pid 125064 00:39:30.673 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 125064 00:39:30.673 17:33:25 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 125064 00:39:30.932 17:33:26 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:39:30.932 00:39:30.932 real 0m4.362s 00:39:30.932 user 0m11.446s 00:39:30.932 sys 0m0.829s 00:39:30.932 17:33:26 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:39:30.932 17:33:26 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:39:30.932 ************************************ 00:39:30.932 END TEST bdev_bounds 00:39:30.932 ************************************ 00:39:31.191 17:33:26 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:39:31.191 17:33:26 
blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:39:31.191 17:33:26 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:39:31.191 17:33:26 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:31.191 17:33:26 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:31.191 ************************************ 00:39:31.191 START TEST bdev_nbd 00:39:31.191 ************************************ 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=125624 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 125624 /var/tmp/spdk-nbd.sock 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 125624 ']' 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:39:31.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:39:31.191 17:33:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:39:31.191 [2024-07-23 17:33:26.486104] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:39:31.191 [2024-07-23 17:33:26.486172] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:39:31.450 [2024-07-23 17:33:26.618650] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:39:31.450 [2024-07-23 17:33:26.670004] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:39:31.450 [2024-07-23 17:33:26.691330] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:39:31.450 [2024-07-23 17:33:26.699359] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:39:31.450 [2024-07-23 17:33:26.707377] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:39:31.450 [2024-07-23 17:33:26.811910] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:39:33.995 [2024-07-23 17:33:29.198625] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:39:33.996 [2024-07-23 17:33:29.198687] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:39:33.996 [2024-07-23 17:33:29.198702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:33.996 [2024-07-23 17:33:29.206643] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:39:33.996 [2024-07-23 17:33:29.206664] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:39:33.996 [2024-07-23 17:33:29.206680] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:33.996 [2024-07-23 17:33:29.214664] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:39:33.996 [2024-07-23 17:33:29.214683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:39:33.996 [2024-07-23 17:33:29.214695] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:33.996 [2024-07-23 17:33:29.222685] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:39:33.996 [2024-07-23 17:33:29.222704] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:39:33.996 [2024-07-23 17:33:29.222715] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:39:33.996 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:34.254 17:33:29 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:34.254 1+0 records in 00:39:34.254 1+0 records out 00:39:34.254 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305481 s, 13.4 MB/s 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:39:34.254 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:39:34.513 17:33:29 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:34.513 1+0 records in 00:39:34.513 1+0 records out 00:39:34.513 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000363291 s, 11.3 MB/s 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:39:34.513 17:33:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:35.079 1+0 records in 00:39:35.079 1+0 records out 00:39:35.079 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310716 s, 13.2 MB/s 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:39:35.079 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:39:35.337 1+0 records in 00:39:35.337 1+0 records out 00:39:35.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000392603 s, 10.4 MB/s 00:39:35.337 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:39:35.595 17:33:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:39:35.595 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd0", 00:39:35.595 "bdev_name": "crypto_ram" 00:39:35.595 }, 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd1", 00:39:35.595 "bdev_name": "crypto_ram2" 00:39:35.595 }, 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd2", 00:39:35.595 "bdev_name": "crypto_ram3" 00:39:35.595 }, 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd3", 00:39:35.595 "bdev_name": "crypto_ram4" 00:39:35.595 } 00:39:35.595 ]' 00:39:35.595 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:39:35.595 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:39:35.595 { 
00:39:35.595 "nbd_device": "/dev/nbd0", 00:39:35.595 "bdev_name": "crypto_ram" 00:39:35.595 }, 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd1", 00:39:35.595 "bdev_name": "crypto_ram2" 00:39:35.595 }, 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd2", 00:39:35.595 "bdev_name": "crypto_ram3" 00:39:35.595 }, 00:39:35.595 { 00:39:35.595 "nbd_device": "/dev/nbd3", 00:39:35.595 "bdev_name": "crypto_ram4" 00:39:35.595 } 00:39:35.595 ]' 00:39:35.595 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:35.853 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:36.111 17:33:31 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:36.111 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:36.370 17:33:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:39:36.936 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:37.195 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:39:37.453 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:39:37.712 /dev/nbd0 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:37.712 
17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:37.712 1+0 records in 00:39:37.712 1+0 records out 00:39:37.712 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306331 s, 13.4 MB/s 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:39:37.712 17:33:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:39:37.971 /dev/nbd1 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:37.971 1+0 records in 00:39:37.971 1+0 records out 00:39:37.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312573 s, 13.1 MB/s 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:39:37.971 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:39:38.228 /dev/nbd10 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:39:38.228 1+0 records in 00:39:38.228 1+0 records out 00:39:38.228 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279961 s, 14.6 MB/s 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:39:38.228 17:33:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:39:38.794 /dev/nbd11 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:39:38.794 1+0 records in 00:39:38.794 1+0 records out 00:39:38.794 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000361949 s, 11.3 MB/s 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:38.794 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd0", 00:39:39.052 "bdev_name": "crypto_ram" 00:39:39.052 }, 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd1", 00:39:39.052 "bdev_name": "crypto_ram2" 00:39:39.052 }, 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd10", 00:39:39.052 "bdev_name": "crypto_ram3" 00:39:39.052 }, 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd11", 00:39:39.052 "bdev_name": "crypto_ram4" 00:39:39.052 } 00:39:39.052 ]' 00:39:39.052 17:33:34 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd0", 00:39:39.052 "bdev_name": "crypto_ram" 00:39:39.052 }, 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd1", 00:39:39.052 "bdev_name": "crypto_ram2" 00:39:39.052 }, 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd10", 00:39:39.052 "bdev_name": "crypto_ram3" 00:39:39.052 }, 00:39:39.052 { 00:39:39.052 "nbd_device": "/dev/nbd11", 00:39:39.052 "bdev_name": "crypto_ram4" 00:39:39.052 } 00:39:39.052 ]' 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:39:39.052 /dev/nbd1 00:39:39.052 /dev/nbd10 00:39:39.052 /dev/nbd11' 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:39:39.052 /dev/nbd1 00:39:39.052 /dev/nbd10 00:39:39.052 /dev/nbd11' 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:39:39.052 17:33:34 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:39:39.052 256+0 records in 00:39:39.052 256+0 records out 00:39:39.052 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115655 s, 90.7 MB/s 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:39:39.052 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:39:39.310 256+0 records in 00:39:39.310 256+0 records out 00:39:39.310 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0622248 s, 16.9 MB/s 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:39:39.310 256+0 records in 00:39:39.310 256+0 records out 00:39:39.310 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0658719 s, 15.9 MB/s 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:39:39.310 256+0 records in 00:39:39.310 256+0 records out 00:39:39.310 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0594214 s, 17.6 MB/s 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:39:39.310 256+0 records in 00:39:39.310 256+0 records out 00:39:39.310 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0581923 s, 18.0 MB/s 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:39:39.310 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:39:39.568 17:33:34 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:39.568 17:33:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:39.826 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:40.085 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:39:40.344 17:33:35 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:40.344 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:40.602 17:33:35 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:39:40.860 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:39:41.119 malloc_lvol_verify 00:39:41.119 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:39:41.378 391e8724-0bbb-45ca-aac5-4d87a858f9cc 00:39:41.378 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:39:41.635 87e66de7-c0c6-4d83-8d47-db1992112595 00:39:41.635 17:33:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:39:41.893 /dev/nbd0 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:39:41.893 mke2fs 1.46.5 (30-Dec-2021) 00:39:41.893 Discarding device blocks: 0/4096 done 00:39:41.893 Creating filesystem with 4096 1k blocks and 1024 inodes 00:39:41.893 00:39:41.893 Allocating group tables: 0/1 done 00:39:41.893 Writing inode tables: 0/1 done 00:39:41.893 Creating journal (1024 blocks): done 00:39:41.893 Writing superblocks and filesystem accounting information: 0/1 done 00:39:41.893 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:39:41.893 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 125624 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 125624 ']' 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 125624 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 
-- # ps --no-headers -o comm= 125624 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 125624' 00:39:42.152 killing process with pid 125624 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 125624 00:39:42.152 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 125624 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:39:42.719 00:39:42.719 real 0m11.471s 00:39:42.719 user 0m14.925s 00:39:42.719 sys 0m4.814s 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:39:42.719 ************************************ 00:39:42.719 END TEST bdev_nbd 00:39:42.719 ************************************ 00:39:42.719 17:33:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:39:42.719 17:33:37 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:39:42.719 17:33:37 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:39:42.719 17:33:37 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:39:42.719 17:33:37 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:39:42.719 17:33:37 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:39:42.719 17:33:37 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:42.719 17:33:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:39:42.719 
************************************ 00:39:42.719 START TEST bdev_fio 00:39:42.719 ************************************ 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:39:42.719 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:39:42.719 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:39:42.720 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:39:42.720 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:39:42.720 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:39:42.720 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:39:42.720 17:33:37 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 
00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:39:42.720 ************************************ 00:39:42.720 START TEST bdev_fio_rw_verify 00:39:42.720 ************************************ 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:39:42.720 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:39:42.988 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:39:42.988 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:39:42.988 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:39:42.988 17:33:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:43.248 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:39:43.248 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:39:43.248 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:39:43.248 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:39:43.248 fio-3.35 00:39:43.248 Starting 4 threads 00:39:58.132 00:39:58.132 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=127818: Tue Jul 23 17:33:51 2024 00:39:58.132 read: IOPS=20.0k, BW=78.0MiB/s (81.8MB/s)(781MiB/10001msec) 00:39:58.132 slat (usec): min=10, max=485, avg=69.46, stdev=40.78 00:39:58.132 clat (usec): min=12, max=1874, avg=368.25, stdev=258.69 00:39:58.132 lat (usec): min=34, max=2106, avg=437.71, stdev=285.56 00:39:58.132 clat percentiles (usec): 00:39:58.132 | 50.000th=[ 306], 99.000th=[ 1221], 99.900th=[ 1663], 99.990th=[ 1778], 00:39:58.132 | 99.999th=[ 1876] 00:39:58.132 write: IOPS=21.9k, BW=85.4MiB/s (89.6MB/s)(834MiB/9769msec); 0 zone resets 00:39:58.132 slat (usec): min=18, max=1747, avg=80.04, stdev=40.73 00:39:58.132 clat (usec): min=28, max=2753, avg=429.87, stdev=295.05 00:39:58.132 lat (usec): min=51, max=2910, avg=509.91, stdev=322.06 00:39:58.132 clat percentiles (usec): 00:39:58.132 | 50.000th=[ 367], 99.000th=[ 1450], 99.900th=[ 2057], 99.990th=[ 2180], 00:39:58.132 | 99.999th=[ 2474] 00:39:58.132 bw ( KiB/s): min=66144, max=122452, per=97.46%, avg=85246.11, stdev=3639.08, samples=76 00:39:58.132 iops : min=16536, max=30613, avg=21311.53, stdev=909.77, samples=76 00:39:58.132 lat (usec) : 20=0.01%, 50=0.33%, 100=6.13%, 250=27.55%, 500=40.96% 00:39:58.132 lat (usec) : 750=13.88%, 1000=6.42% 00:39:58.132 lat (msec) : 2=4.63%, 4=0.09% 00:39:58.132 cpu : usr=99.61%, sys=0.00%, ctx=69, majf=0, minf=338 00:39:58.132 IO depths : 1=10.3%, 2=25.4%, 4=51.1%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:39:58.132 submit : 0=0.0%, 4=100.0%, 8=0.0%, 
16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:39:58.132 complete : 0=0.0%, 4=88.8%, 8=11.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:39:58.132 issued rwts: total=199814,213608,0,0 short=0,0,0,0 dropped=0,0,0,0 00:39:58.132 latency : target=0, window=0, percentile=100.00%, depth=8 00:39:58.132 00:39:58.132 Run status group 0 (all jobs): 00:39:58.132 READ: bw=78.0MiB/s (81.8MB/s), 78.0MiB/s-78.0MiB/s (81.8MB/s-81.8MB/s), io=781MiB (818MB), run=10001-10001msec 00:39:58.132 WRITE: bw=85.4MiB/s (89.6MB/s), 85.4MiB/s-85.4MiB/s (89.6MB/s-89.6MB/s), io=834MiB (875MB), run=9769-9769msec 00:39:58.132 00:39:58.132 real 0m13.623s 00:39:58.132 user 0m46.025s 00:39:58.132 sys 0m0.663s 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:39:58.132 ************************************ 00:39:58.132 END TEST bdev_fio_rw_verify 00:39:58.132 ************************************ 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:39:58.132 17:33:51 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ae2d32c2-366c-597e-8fb7-c7a860bb7ad2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae2d32c2-366c-597e-8fb7-c7a860bb7ad2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5a63c159-c261-5b1e-b6c2-886203ec6d2e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5a63c159-c261-5b1e-b6c2-886203ec6d2e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "52e55600-9473-578a-957f-224a4a20aca4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "52e55600-9473-578a-957f-224a4a20aca4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "cd4b2ad2-4f1d-5f04-aa3d-ad4fb0592169"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cd4b2ad2-4f1d-5f04-aa3d-ad4fb0592169",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:39:58.132 17:33:51 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:39:58.132 crypto_ram2 00:39:58.132 crypto_ram3 00:39:58.132 crypto_ram4 ]] 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ae2d32c2-366c-597e-8fb7-c7a860bb7ad2"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ae2d32c2-366c-597e-8fb7-c7a860bb7ad2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "5a63c159-c261-5b1e-b6c2-886203ec6d2e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5a63c159-c261-5b1e-b6c2-886203ec6d2e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "52e55600-9473-578a-957f-224a4a20aca4"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "52e55600-9473-578a-957f-224a4a20aca4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "cd4b2ad2-4f1d-5f04-aa3d-ad4fb0592169"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cd4b2ad2-4f1d-5f04-aa3d-ad4fb0592169",' 
' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 
00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:39:58.132 ************************************ 00:39:58.132 START TEST bdev_fio_trim 00:39:58.132 ************************************ 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:39:58.132 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ 
-n '' ]]
00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:39:58.133 17:33:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:39:58.133 17:33:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib=
00:39:58.133 17:33:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:39:58.133 17:33:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:39:58.133 17:33:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:39:58.133 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:39:58.133 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:39:58.133 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:39:58.133 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:39:58.133 fio-3.35
00:39:58.133 Starting 4 threads
00:40:10.345 
00:40:10.345 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=129687: Tue Jul 23 17:34:05 2024
00:40:10.345 write: IOPS=37.4k, BW=146MiB/s (153MB/s)(1460MiB/10001msec); 0 zone resets
00:40:10.345 slat (usec): min=18, max=1924, avg=59.78, stdev=36.83
00:40:10.345 clat (usec): min=96, max=2668, avg=491.35, stdev=167.22
00:40:10.345 lat (usec): min=130, max=2695, avg=551.13, stdev=171.94
00:40:10.345 clat percentiles (usec):
00:40:10.345 | 50.000th=[ 510], 99.000th=[ 988], 99.900th=[ 1188], 99.990th=[ 1303],
00:40:10.345 | 99.999th=[ 2606]
00:40:10.345 bw ( KiB/s): min=137912, max=203728, per=100.00%, avg=150037.63, stdev=6462.15, samples=76
00:40:10.345 iops : min=34478, max=50932, avg=37509.37, stdev=1615.52, samples=76
00:40:10.345 trim: IOPS=37.4k, BW=146MiB/s (153MB/s)(1460MiB/10001msec); 0 zone resets
00:40:10.345 slat (nsec): min=6287, max=68353, avg=16172.23, stdev=6549.29
00:40:10.345 clat (usec): min=33, max=2048, avg=153.44, stdev=144.72
00:40:10.345 lat (usec): min=42, max=2081, avg=169.61, stdev=147.46
00:40:10.345 clat percentiles (usec):
00:40:10.345 | 50.000th=[ 83], 99.000th=[ 685], 99.900th=[ 816], 99.990th=[ 898],
00:40:10.345 | 99.999th=[ 1254]
00:40:10.345 bw ( KiB/s): min=137904, max=203736, per=100.00%, avg=150043.95, stdev=6464.80, samples=76
00:40:10.345 iops : min=34476, max=50934, avg=37510.95, stdev=1616.18, samples=76
00:40:10.345 lat (usec) : 50=2.20%, 100=27.81%, 250=15.21%, 500=26.48%, 750=26.42%
00:40:10.345 lat (usec) : 1000=1.43%
00:40:10.345 lat (msec) : 2=0.45%, 4=0.01%
00:40:10.345 cpu : usr=99.60%, sys=0.00%, ctx=60, majf=0, minf=175
00:40:10.345 IO depths : 1=0.1%, 2=10.9%, 4=52.7%, 8=36.4%, 16=0.0%, 32=0.0%, >=64=0.0%
00:40:10.346 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:40:10.346 complete : 0=0.0%, 4=96.0%, 8=4.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:40:10.346 issued rwts: total=0,373687,373688,0 short=0,0,0,0 dropped=0,0,0,0
00:40:10.346 latency : target=0, window=0, percentile=100.00%, depth=8
00:40:10.346 
00:40:10.346 Run status group 0 (all jobs):
00:40:10.346 WRITE: bw=146MiB/s (153MB/s), 146MiB/s-146MiB/s (153MB/s-153MB/s), io=1460MiB (1531MB), run=10001-10001msec
00:40:10.346 TRIM: bw=146MiB/s (153MB/s), 146MiB/s-146MiB/s (153MB/s-153MB/s), io=1460MiB (1531MB), run=10001-10001msec
00:40:10.346 
00:40:10.346 real 0m13.693s
00:40:10.346 user 0m45.838s
00:40:10.346 sys 0m0.663s
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:40:10.346 ************************************
00:40:10.346 END TEST bdev_fio_trim
00:40:10.346 ************************************
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd
00:40:10.346 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT
00:40:10.346 
00:40:10.346 real 0m27.686s
00:40:10.346 user 1m32.040s
00:40:10.346 sys 0m1.541s
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:40:10.346 ************************************
00:40:10.346 END TEST bdev_fio
00:40:10.346 ************************************
00:40:10.346 17:34:05 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:40:10.346 17:34:05 blockdev_crypto_aesni
-- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT
00:40:10.346 17:34:05 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:40:10.346 17:34:05 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:40:10.346 17:34:05 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:40:10.346 17:34:05 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:40:10.346 ************************************
00:40:10.346 START TEST bdev_verify
00:40:10.346 ************************************
00:40:10.346 17:34:05 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:40:10.605 [2024-07-23 17:34:05.816383] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:40:10.605 [2024-07-23 17:34:05.816448] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid130942 ]
00:40:10.605 [2024-07-23 17:34:05.948740] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:40:10.605 [2024-07-23 17:34:06.004718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:40:10.605 [2024-07-23 17:34:06.004724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:40:10.605 [2024-07-23 17:34:06.026252] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:40:10.865 [2024-07-23 17:34:06.034282] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:40:10.865 [2024-07-23 17:34:06.042303] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:40:10.865 [2024-07-23 17:34:06.156718] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:40:13.401 [2024-07-23 17:34:08.537396] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:40:13.401 [2024-07-23 17:34:08.537480] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:40:13.401 [2024-07-23 17:34:08.537495] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:13.401 [2024-07-23 17:34:08.545410] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:40:13.401 [2024-07-23 17:34:08.545429] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:40:13.401 [2024-07-23 17:34:08.545441] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:13.401 [2024-07-23 17:34:08.553432] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:40:13.401 [2024-07-23 17:34:08.553451] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:40:13.401 [2024-07-23 17:34:08.553462] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:13.401 [2024-07-23 17:34:08.561456] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:40:13.401 [2024-07-23 17:34:08.561474] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:40:13.401 [2024-07-23 17:34:08.561485] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:13.401 Running I/O for 5 seconds...
00:40:18.676 
00:40:18.676 Latency(us)
00:40:18.676 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:40:18.676 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x0 length 0x1000
00:40:18.676 crypto_ram : 5.06 923.04 3.61 0.00 0.00 137728.89 9573.95 93004.13
00:40:18.676 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x1000 length 0x1000
00:40:18.676 crypto_ram : 5.08 756.06 2.95 0.00 0.00 167314.64 6069.20 109416.63
00:40:18.676 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x0 length 0x1000
00:40:18.676 crypto_ram2 : 5.06 931.95 3.64 0.00 0.00 136561.90 3903.67 93460.03
00:40:18.676 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x1000 length 0x1000
00:40:18.676 crypto_ram2 : 5.08 751.74 2.94 0.00 0.00 169409.44 7579.38 108960.72
00:40:18.676 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x0 length 0x1000
00:40:18.676 crypto_ram3 : 5.05 2937.63 11.48 0.00 0.00 43196.90 4274.09 34876.55
00:40:18.676 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x1000 length 0x1000
00:40:18.676 crypto_ram3 : 5.06 2375.72 9.28 0.00 0.00 53538.40 3419.27 59723.24
00:40:18.676 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x0 length 0x1000
00:40:18.676 crypto_ram4 : 5.06 2937.02 11.47 0.00 0.00 43098.12 4729.99 39207.62
00:40:18.676 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:40:18.676 Verification LBA range: start 0x1000 length 0x1000
00:40:18.676 crypto_ram4 : 5.07 2373.78 9.27 0.00 0.00 53400.50 7180.47 49009.53
00:40:18.676 ===================================================================================================================
00:40:18.676 Total : 13986.96 54.64 0.00 0.00 72661.46 3419.27 109416.63
00:40:18.935 
00:40:18.935 real 0m8.406s
00:40:18.935 user 0m15.818s
00:40:18.935 sys 0m0.551s
00:40:18.935 17:34:14 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:40:18.935 17:34:14 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:40:18.935 ************************************
00:40:18.935 END TEST bdev_verify
00:40:18.935 ************************************
00:40:18.935 17:34:14 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:40:18.935 17:34:14 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:40:18.935 17:34:14 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:40:18.935 17:34:14
blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:40:18.935 17:34:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:40:18.935 ************************************
00:40:18.935 START TEST bdev_verify_big_io
00:40:18.935 ************************************
00:40:18.935 17:34:14 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:40:18.935 [2024-07-23 17:34:14.307767] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:40:18.935 [2024-07-23 17:34:14.307837] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid132012 ]
00:40:19.194 [2024-07-23 17:34:14.442832] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:40:19.194 [2024-07-23 17:34:14.498593] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:40:19.194 [2024-07-23 17:34:14.498598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:40:19.194 [2024-07-23 17:34:14.520040] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:40:19.194 [2024-07-23 17:34:14.528069] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:40:19.194 [2024-07-23 17:34:14.536088] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:40:19.454 [2024-07-23 17:34:14.640697] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:40:21.991 [2024-07-23 17:34:17.026451] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:40:21.991 [2024-07-23 17:34:17.026540] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:40:21.991 [2024-07-23 17:34:17.026555] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:21.991 [2024-07-23 17:34:17.034469] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:40:21.991 [2024-07-23 17:34:17.034489] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:40:21.991 [2024-07-23 17:34:17.034500] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:21.991 [2024-07-23 17:34:17.042492] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:40:21.991 [2024-07-23 17:34:17.042510] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:40:21.991 [2024-07-23 17:34:17.042522] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:21.991 [2024-07-23 17:34:17.050518] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:40:21.991 [2024-07-23 17:34:17.050536] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:40:21.991 [2024-07-23 17:34:17.050548] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:40:21.991 Running I/O for 5 seconds...
00:40:22.559 [2024-07-23 17:34:17.966282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:40:22.559 [2024-07-23 17:34:17.966870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:40:22.822 [2024-07-23 17:34:18.010047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.010848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.011172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.011192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.012574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.012636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.012690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.822 [2024-07-23 17:34:18.012751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.013226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.013285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.013336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.013392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.013878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.013905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.015023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.015085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.015143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.822 [2024-07-23 17:34:18.015193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.015660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.015717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.015769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.015822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.016149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.016170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.017403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.017468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.017519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.017571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.018043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.018100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.018160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.018214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.018701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.018721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.020824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.021227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.021248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.022374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.022435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.022487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.022539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.023016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.023075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.023128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.023180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.023668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.023691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.025822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.025931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.026396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.026416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.027557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.027619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.027670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.027722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.028200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.028258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.028310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.028379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.028867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.028888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.030911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.031344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.031364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.032488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.032552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.032615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.032666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.033190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.033252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.033304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.033356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.033812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.033832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.034963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.035033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.035085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.035135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.823 [2024-07-23 17:34:18.035602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.035659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.035714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.035773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.036152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.036172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.037351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.037412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.037464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.037515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.038088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.038147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.823 [2024-07-23 17:34:18.038199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.824 [2024-07-23 17:34:18.038253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.038714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.038734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.039823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.039885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.039945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.039996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.040466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.040532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.040586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.040638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.040959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.040979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.824 [2024-07-23 17:34:18.042135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.042197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.042248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.042299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.042924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.042985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.043038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.043090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.043600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.043625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.044714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.044775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.044827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.824 [2024-07-23 17:34:18.044878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.045352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.045410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.045462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.045514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.045830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.045849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.047145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.047209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.047262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.047293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.047311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.048187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.824 [2024-07-23 17:34:18.048250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.048304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.048359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.048684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.048704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.050722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.052456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.054215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.055975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.056857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.057532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.059086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:22.824 [2024-07-23 17:34:18.060837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:22.824 [2024-07-23 17:34:18.061158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [... previous message repeated continuously, with identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468, through 2024-07-23 17:34:18.364085 ...] 
00:40:23.090 [2024-07-23 17:34:18.365646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.367404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.369165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.371305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.373070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.374828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.375731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.376175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.376197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.379008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.380819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.382007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.383556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [2024-07-23 17:34:18.385725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.386236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.386729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.388338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.388655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.388675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.391501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.393314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.395076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.396351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.397742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.399300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.401051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [2024-07-23 17:34:18.402802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.403301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.403321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.406087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.406839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.406887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.407383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.407444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.407760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.409641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.411407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.412510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.414284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [2024-07-23 17:34:18.414603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.414623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.415709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.415772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.415824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.415877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.416375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.416548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.416627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.416680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.416731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.417116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.417138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [2024-07-23 17:34:18.418187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.418978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.419030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.419343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.419363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.420567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.420637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [2024-07-23 17:34:18.420691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.420742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.421277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.421453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.421509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.421560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.421612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.422019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.422040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.090 [2024-07-23 17:34:18.423572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.090 [2024-07-23 17:34:18.423917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.424233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.424253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.425479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.425541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.425594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.425647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.426068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.426248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.426305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.426356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.426415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.426736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.426756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.427825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.427912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.427965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.428017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.428365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.428540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.428595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.428647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.428698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.429023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.429044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.430317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.430378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.430430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.430486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.430800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.430987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.431051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.431122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.431174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.431489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.431508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.432693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.432755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.432810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.432862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.433951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.435306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.435368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.435420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.435472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.435788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.435978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.436037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.436090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.436141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.436455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.436476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.437667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.437730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.437790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.437844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.438838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.440175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.440241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.440293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.440346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.440706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.440887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.440955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.441006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.441064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.441378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.441398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.442464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.442535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.442591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.442643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.442967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.443144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.091 [2024-07-23 17:34:18.443199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 [2024-07-23 17:34:18.443258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.091 (... previous message repeated for intermediate entries, timestamps 17:34:18.443314 through 17:34:18.554430; duplicates elided ...) 
00:40:23.356 [2024-07-23 17:34:18.554953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.556660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.558335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.559601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.561068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.561523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.561544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.563506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.564015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.564513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.565027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.565486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.566100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.566601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.567109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.567600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.568134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.568164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.570153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.570655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.571162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.571660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.572107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.572710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.573219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.573720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.574235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.574711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.574733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.576365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.576870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.577375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.577866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.578291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.578906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.579403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.579921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.580426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.580866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.580886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.582510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.583022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.583515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.584016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.584509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.585126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.585624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.586125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.586620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.587090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.587110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.588906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.589410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.589921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.590413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.590820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.591430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.591934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.592429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.592926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.593453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.593473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.595373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.595876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.596376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.596864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.597402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.598012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.598512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.599022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.599510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.600028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.600049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.601873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.602406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.602902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.603389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.603888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.604503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.356 [2024-07-23 17:34:18.605013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.605501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.606007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.606526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.606547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.356 [2024-07-23 17:34:18.608176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.608673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.609166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.609657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.610143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.610743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.611244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.611731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.612234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.612713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.612733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.614627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.615133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.615627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.616131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.616669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.617277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.617770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.618272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.618768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.619235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.619256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.621067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.621569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.622082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.622575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.623038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.623640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.624152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.624648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.625146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.625633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.625653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.627566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.628088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.628580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.629087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.629614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.630230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.630730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.631233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.632359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.632737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.632757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.635600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.637282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.639093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.639591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.640142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.641802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.643560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.645313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.646502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.646879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.646907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.648494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.649009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.650481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.652030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.652349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.652968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.654554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.656167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.657612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.658125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.658146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.661123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.662885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.663856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.665621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.665947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.667819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.668334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.668832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.670644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.670969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.670991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.673813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.675584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.677345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.678346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.678836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.679864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.681598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.683353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.357 [2024-07-23 17:34:18.685119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.357 [2024-07-23 17:34:18.685601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:40:23.623 [previous message repeated ~270 times between 17:34:18.685601 and 17:34:18.871186]
00:40:23.623 [2024-07-23 17:34:18.871238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.871306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.871619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.871640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.873986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.623 [2024-07-23 17:34:18.874299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.874319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.875357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.875425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.875477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.875529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.875843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.876029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.876085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.876145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.876201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.876522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.876543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.623 [2024-07-23 17:34:18.878106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.878985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.879298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.879323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.880440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.880500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.623 [2024-07-23 17:34:18.880551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.880602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.880928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.881106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.881171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.881223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.881274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.881645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.881666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.883376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.883436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.883487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.883538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.623 [2024-07-23 17:34:18.883929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.884104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.884167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.884223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.884274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.884586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.884606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.887308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.623 [2024-07-23 17:34:18.887371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.887422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.887476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.887967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.888140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.888201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.888257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.888308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.888657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.888678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.891232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.891301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.891361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.891413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.891793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.891989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.892048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.892100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.892152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.892564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.892586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.895991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.896056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.896539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.896560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.899550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.899613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.899664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.899719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.900106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.900288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.900345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.900409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.900488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.901006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.901027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.903839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.903911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.903963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.904015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.904480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.904653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.904709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.904761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.904815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.905312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.905333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.908471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.908536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.908593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.908646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.909886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.913037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.913099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.913156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.913209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.913704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.913881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.913984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.914037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.914106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.914588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.914612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.917993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.624 [2024-07-23 17:34:18.918857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.918919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.919381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.919402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.922759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.922821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.922872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.922934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.923416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.624 [2024-07-23 17:34:18.923593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.923678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.923765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.923822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.625 [2024-07-23 17:34:18.924288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.924309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.927568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.927646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.927727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.927791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.928226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.928401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.928458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.928510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.928582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.929118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.625 [2024-07-23 17:34:18.929139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.625 [2024-07-23 17:34:18.932265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 last message repeated through [2024-07-23 17:34:19.203134]
00:40:23.888 [2024-07-23 17:34:19.203573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 [2024-07-23 17:34:19.203593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 [2024-07-23 17:34:19.206144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 [2024-07-23 17:34:19.207898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 [2024-07-23 17:34:19.209611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 [2024-07-23 17:34:19.210905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.888 [2024-07-23 17:34:19.211270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.213142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.214948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.215442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.215941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.216259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.216279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.218489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.220300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.222088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.223825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.224149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.224750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.225473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.227030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.228780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.229108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.229128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.232001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.233780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.234714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.235213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.235538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.237199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.238957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.240713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.241827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.242230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.242251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.243703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.244214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.246019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.247818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.248142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.249606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.251302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.253011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.254783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.255107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.255128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.258195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.260001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.261760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.263138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.263458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.265172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.266979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.268525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.269021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.269523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.269543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.272284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.273378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.275109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.276868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.277194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.278542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.279050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.280094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.281647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.281969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.281988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.284687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.286458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.288208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.288905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.289310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.290694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.292166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.292972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.294430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.294749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.294769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.296511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.297022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.297520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.298489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.298885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.300310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.300991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.301488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.303082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.303577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.303597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.305262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:23.889 [2024-07-23 17:34:19.305764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.306270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.306763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:23.889 [2024-07-23 17:34:19.307165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.309023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.310093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.311648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.313423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.313921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.313941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.315507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.316028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.316523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.151 [2024-07-23 17:34:19.317023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.317526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.318138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.318638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.319143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.319637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.320055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.320076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.321990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.322071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.322564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.322623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.323107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.151 [2024-07-23 17:34:19.323730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.324257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.324755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.325260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.325688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.325709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.151 [2024-07-23 17:34:19.327292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.327355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.327409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.327461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.327884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.328072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.328143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.152 [2024-07-23 17:34:19.328209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.328279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.328717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.328737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.329969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.330956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.331022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.152 [2024-07-23 17:34:19.331552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.331576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.333978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.334044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.334118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.334586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.334606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.152 [2024-07-23 17:34:19.336024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.336974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.337055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.337556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.337576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.338942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.152 [2024-07-23 17:34:19.339008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.152 [2024-07-23 17:34:19.339059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [... identical "Failed to get src_mbufs!" error repeated through 2024-07-23 17:34:19.469825 ...]
00:40:24.155 [2024-07-23 17:34:19.469885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.471174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.471626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.471646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.474471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.475379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.477102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.478844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.480177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.481686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.482883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.483719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.484043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.155 [2024-07-23 17:34:19.484063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.486168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.487843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.489601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.491356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.493566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.494468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.495600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.497142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.497495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.497515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.500103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.501745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.155 [2024-07-23 17:34:19.503545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.505108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.506337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.508081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.509027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.510567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.510884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.510909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.513655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.515421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.517180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.518166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.520148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.155 [2024-07-23 17:34:19.520752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.522305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.524036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.524352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.524372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.527286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.529053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.530002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.531595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.155 [2024-07-23 17:34:19.533318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.534830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.536390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.538189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.156 [2024-07-23 17:34:19.538507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.538526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.541423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.543094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.544594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.545276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.546828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.548398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.550155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.551913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.552337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.552357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.555205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.156 [2024-07-23 17:34:19.556262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.557240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.558930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.561233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.563001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.564759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.565955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.566271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.566292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.568457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.156 [2024-07-23 17:34:19.570079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.571241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.572107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.418 [2024-07-23 17:34:19.574247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.576015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.577443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.579040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.579407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.579427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.580966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.581814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.583368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.585127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.586441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.588011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.589768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.418 [2024-07-23 17:34:19.591533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.591984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.592005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.594685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.596495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.598085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.599505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.601723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.603536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.604035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.604523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.605071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.605093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.418 [2024-07-23 17:34:19.608097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.609868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.610809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.612367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.614540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.615642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.616140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.616627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.617129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.617150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.620059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.621772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.623088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.418 [2024-07-23 17:34:19.624634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.626807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.627322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.627813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.628325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.628797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.628817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.631620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.632536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.634092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.635832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.637111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.637608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.418 [2024-07-23 17:34:19.638109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.638598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.638920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.638940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.641004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.642613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.644376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.646135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.647127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.647624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.648118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.649556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.649914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.418 [2024-07-23 17:34:19.649935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.652414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.653980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.655787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.657430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.658541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.418 [2024-07-23 17:34:19.659060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.659813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.661371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.661688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.661708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.664476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.666186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.419 [2024-07-23 17:34:19.667949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.668672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.669770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.670273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.671879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.673530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.673849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.673868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.676595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.678395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.679869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.680371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.419 [2024-07-23 17:34:19.681464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.419 [2024-07-23 17:34:19.682407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:40:24.422 [... previous message repeated many times between 17:34:19.682407 and 17:34:19.832068; identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 omitted ...]
00:40:24.422 [2024-07-23 17:34:19.832118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.422 [2024-07-23 17:34:19.832613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.422 [2024-07-23 17:34:19.832633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.422 [2024-07-23 17:34:19.835320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.422 [2024-07-23 17:34:19.835388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.422 [2024-07-23 17:34:19.835441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.422 [2024-07-23 17:34:19.835495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.077920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.078428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.078491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.079440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.085224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.087000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.681 [2024-07-23 17:34:20.087595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.088095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.088764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.089674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.089740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.091270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.093034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.094787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.095285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.095305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.100230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.100739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.681 [2024-07-23 17:34:20.101608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.681 [2024-07-23 17:34:20.103162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.105333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.106217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.107770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.109527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.109843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.109864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.114016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.115776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.117543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.118429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.120606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.122373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.945 [2024-07-23 17:34:20.123324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.123811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.124348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.124369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.128740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.130502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.132259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.134022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.135027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.135096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.135585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.135640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.136148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.945 [2024-07-23 17:34:20.136169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.138807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.140393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.140907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.141397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.142575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.142652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.143155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.143215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.143763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.143784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.145832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.147117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.945 [2024-07-23 17:34:20.148299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.149133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.150286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.150354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.151299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.151356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.151716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.151736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.155234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.155745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.156244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.156879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.158582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.945 [2024-07-23 17:34:20.159963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.160023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.161818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.162310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.162331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.167214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.167918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.169213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.169273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.170846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.170920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.171704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.173499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.945 [2024-07-23 17:34:20.173987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.174008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.178278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.180031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.180091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.180744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.181276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.182021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.945 [2024-07-23 17:34:20.182516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.182575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.183072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.183093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.186175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.946 [2024-07-23 17:34:20.186680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.186747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.187256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.188317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.188820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.188890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.189395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.189906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.189928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.192910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.193422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.193481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.193982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.946 [2024-07-23 17:34:20.195105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.195174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.195661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.196161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.196579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.196600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.199782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.200296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.200359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.200856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.201580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.202099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.202599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.946 [2024-07-23 17:34:20.202657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.203093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.203116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.206102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.206610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.207112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.207605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.208624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.209138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.209201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.209691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.210232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.210252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.946 [2024-07-23 17:34:20.213263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.213765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.213827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.214329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.215285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.215362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.215850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.215912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.216283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.216304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.219234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.219732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.219787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.946 [2024-07-23 17:34:20.220289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.224884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.226589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.226656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.228351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.233282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.234391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.234452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.235792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.242108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.242790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.242848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:24.946 [2024-07-23 17:34:20.244388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:24.946 [2024-07-23 17:34:20.250282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.309874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (last message repeated continuously between 17:34:20.250282 and 17:34:21.309874; duplicate log lines collapsed)
00:40:26.098 [2024-07-23 17:34:21.318041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.319538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.319598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.321328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.321648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.329230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.330988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.331047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.332785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.333302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.341362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.343171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.343237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.098 [2024-07-23 17:34:21.344992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.098 [2024-07-23 17:34:21.345310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.351926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.353450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.353509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.355183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.355555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.362672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.363573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.363634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.365185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.365503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.371760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.099 [2024-07-23 17:34:21.373563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.373623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.374979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.375297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.382735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.383252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.383321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.384175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.384558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.390844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.391355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.391423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.391922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.099 [2024-07-23 17:34:21.392418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.396838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.397376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.397432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.398845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.399274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.406630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.408138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.408201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.409917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.410399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.417068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.417576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.099 [2024-07-23 17:34:21.417632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.418778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.419224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.424742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.425263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.425326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.425812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.426255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.433394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.433467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.434179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.434241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.434673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.099 [2024-07-23 17:34:21.440753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.440833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.440890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.440949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.441426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.447388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.447459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.447511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.447574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.448071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.452026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.452091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.452143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.099 [2024-07-23 17:34:21.452195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.452509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.457962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.458038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.458105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.458171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.458631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.465731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.465796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.465854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.465914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.466336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.470536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.099 [2024-07-23 17:34:21.470602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.470657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.470740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.471217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.473113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.473177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.473229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.473280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.473826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.477252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.477343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.099 [2024-07-23 17:34:21.477404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.477456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.100 [2024-07-23 17:34:21.477954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.480798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.481319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.481814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.481873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.482256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.485192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.485258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.485311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.485374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.485868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.488440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.488505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.100 [2024-07-23 17:34:21.488562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.490181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.490629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.495179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.495243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.496987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.497045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.497360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.500624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.500695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.501656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.501716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.502040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.100 [2024-07-23 17:34:21.507143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.507214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.508974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.509032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.509512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.514597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.514674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.516471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.516531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.100 [2024-07-23 17:34:21.516850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.521701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.521775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.521836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.360 [2024-07-23 17:34:21.521887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.522209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.525437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.525507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.526024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.526079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.526398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.531231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.531303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.532852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.532917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.533233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.538488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.360 [2024-07-23 17:34:21.538560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.539517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.539578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.539939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.546810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.546889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.547390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.547467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.547994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.553381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.553452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.554299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:26.360 [2024-07-23 17:34:21.554357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:26.360 [2024-07-23 17:34:21.554743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... previous message repeated for each subsequent allocation attempt between 17:34:21.554743 and 17:34:22.493274 (~270 identical *ERROR* lines from accel_dpdk_cryptodev.c:468 condensed) ...]
00:40:27.192 [2024-07-23 17:34:22.493274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:40:27.192 [2024-07-23 17:34:22.493339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.493654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.496069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.497205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.497268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.497323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.497825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.501361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.501432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.503223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.503290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.503765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.508561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.192 [2024-07-23 17:34:22.508632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.509156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.509216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.509596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.513876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.513952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.514621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.514682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.515004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.517799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.517869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.518909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.518967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.192 [2024-07-23 17:34:22.519283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.523263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.523341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.523837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.523915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.524424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.527097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.527168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.527660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.527718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.528073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.532412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.532482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.192 [2024-07-23 17:34:22.532988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.533048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.533364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.538280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.538352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.538846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.538911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.539381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.544312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.544384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.545087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.545159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.192 [2024-07-23 17:34:22.545477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.192 [2024-07-23 17:34:22.549005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.549075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.550815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.550871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.551285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.556989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.557061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.557553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.557609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.557974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.563541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.563612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.564115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.193 [2024-07-23 17:34:22.564175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.564650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.569160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.569231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.570006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.570065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.570520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.573856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.573934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.574428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.574486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.574925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.578183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.193 [2024-07-23 17:34:22.578259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.578755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.578814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.579287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.582459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.582532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.583410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.583470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:40:27.193 [2024-07-23 17:34:22.583856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:40:27.761 00:40:27.761 Latency(us) 00:40:27.761 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:27.761 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x0 length 0x100 00:40:27.761 crypto_ram : 5.81 44.06 2.75 0.00 0.00 2810975.72 53568.56 2771887.86 00:40:27.761 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x100 length 0x100 00:40:27.761 crypto_ram : 5.91 37.07 2.32 0.00 0.00 3245758.55 44450.50 3515920.92 00:40:27.761 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x0 length 0x100 00:40:27.761 crypto_ram2 : 5.81 44.56 2.78 0.00 0.00 2691287.05 26784.28 2771887.86 00:40:27.761 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x100 length 0x100 00:40:27.761 crypto_ram2 : 5.93 40.13 2.51 0.00 0.00 2946118.80 53340.61 3545098.69 00:40:27.761 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x0 length 0x100 00:40:27.761 crypto_ram3 : 5.57 298.26 18.64 0.00 0.00 384946.24 31229.33 514258.14 00:40:27.761 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x100 length 0x100 00:40:27.761 crypto_ram3 : 5.63 235.45 14.72 0.00 0.00 481195.74 35560.40 587202.56 00:40:27.761 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x0 length 0x100 00:40:27.761 crypto_ram4 : 5.68 313.65 19.60 0.00 0.00 355686.72 9118.05 439490.11 00:40:27.761 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:40:27.761 Verification LBA range: start 0x100 length 0x100 00:40:27.761 crypto_ram4 : 5.75 252.20 15.76 0.00 0.00 436139.58 
23137.06 525199.81 00:40:27.761 =================================================================================================================== 00:40:27.761 Total : 1265.38 79.09 0.00 0.00 747191.85 9118.05 3545098.69 00:40:28.329 00:40:28.329 real 0m9.279s 00:40:28.329 user 0m17.470s 00:40:28.329 sys 0m0.629s 00:40:28.329 17:34:23 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:28.329 17:34:23 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:40:28.329 ************************************ 00:40:28.329 END TEST bdev_verify_big_io 00:40:28.329 ************************************ 00:40:28.329 17:34:23 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:40:28.329 17:34:23 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:40:28.329 17:34:23 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:40:28.329 17:34:23 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:28.329 17:34:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:40:28.329 ************************************ 00:40:28.329 START TEST bdev_write_zeroes 00:40:28.329 ************************************ 00:40:28.329 17:34:23 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:40:28.329 [2024-07-23 17:34:23.681713] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:40:28.329 [2024-07-23 17:34:23.681782] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133234 ] 00:40:28.588 [2024-07-23 17:34:23.816934] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:28.588 [2024-07-23 17:34:23.874536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:28.588 [2024-07-23 17:34:23.895902] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:40:28.588 [2024-07-23 17:34:23.903929] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:40:28.588 [2024-07-23 17:34:23.911963] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:40:28.847 [2024-07-23 17:34:24.025151] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:40:31.383 [2024-07-23 17:34:26.416524] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:40:31.383 [2024-07-23 17:34:26.416601] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:40:31.383 [2024-07-23 17:34:26.416618] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:31.383 [2024-07-23 17:34:26.424542] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:40:31.383 [2024-07-23 17:34:26.424561] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:40:31.383 [2024-07-23 17:34:26.424573] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:31.383 [2024-07-23 17:34:26.432561] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_aesni_cbc_3" 00:40:31.383 [2024-07-23 17:34:26.432579] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:40:31.383 [2024-07-23 17:34:26.432590] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:31.383 [2024-07-23 17:34:26.440583] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:40:31.383 [2024-07-23 17:34:26.440600] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:40:31.383 [2024-07-23 17:34:26.440611] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:31.383 Running I/O for 1 seconds... 00:40:32.319 00:40:32.319 Latency(us) 00:40:32.319 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:40:32.319 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:40:32.319 crypto_ram : 1.02 1975.37 7.72 0.00 0.00 64261.33 5442.34 77503.44 00:40:32.319 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:40:32.319 crypto_ram2 : 1.03 1989.07 7.77 0.00 0.00 63535.91 5385.35 72032.61 00:40:32.319 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:40:32.319 crypto_ram3 : 1.02 15169.22 59.25 0.00 0.00 8296.16 2464.72 10827.69 00:40:32.319 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:40:32.319 crypto_ram4 : 1.02 15214.18 59.43 0.00 0.00 8244.64 1966.08 8662.15 00:40:32.319 =================================================================================================================== 00:40:32.319 Total : 34347.85 134.17 0.00 0.00 14710.28 1966.08 77503.44 00:40:32.577 00:40:32.577 real 0m4.305s 00:40:32.577 user 0m3.713s 00:40:32.577 sys 0m0.547s 00:40:32.577 17:34:27 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:32.577 
17:34:27 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:40:32.577 ************************************ 00:40:32.577 END TEST bdev_write_zeroes 00:40:32.577 ************************************ 00:40:32.577 17:34:27 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:40:32.577 17:34:27 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:40:32.577 17:34:27 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:40:32.577 17:34:27 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:32.577 17:34:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:40:32.836 ************************************ 00:40:32.836 START TEST bdev_json_nonenclosed 00:40:32.836 ************************************ 00:40:32.836 17:34:28 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:40:32.836 [2024-07-23 17:34:28.071531] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:40:32.836 [2024-07-23 17:34:28.071592] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133775 ] 00:40:32.836 [2024-07-23 17:34:28.203680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:32.836 [2024-07-23 17:34:28.253016] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:32.836 [2024-07-23 17:34:28.253089] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:40:32.836 [2024-07-23 17:34:28.253107] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:40:32.836 [2024-07-23 17:34:28.253119] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:40:33.095 00:40:33.095 real 0m0.330s 00:40:33.095 user 0m0.193s 00:40:33.095 sys 0m0.135s 00:40:33.095 17:34:28 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:40:33.095 17:34:28 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:33.095 17:34:28 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:40:33.095 ************************************ 00:40:33.095 END TEST bdev_json_nonenclosed 00:40:33.095 ************************************ 00:40:33.095 17:34:28 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:40:33.095 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # true 00:40:33.095 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:40:33.095 17:34:28 blockdev_crypto_aesni -- 
common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:40:33.095 17:34:28 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:33.095 17:34:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:40:33.095 ************************************ 00:40:33.095 START TEST bdev_json_nonarray 00:40:33.095 ************************************ 00:40:33.095 17:34:28 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:40:33.095 [2024-07-23 17:34:28.485582] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:40:33.095 [2024-07-23 17:34:28.485651] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid133884 ] 00:40:33.354 [2024-07-23 17:34:28.617747] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:33.354 [2024-07-23 17:34:28.671174] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:33.354 [2024-07-23 17:34:28.671249] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:40:33.354 [2024-07-23 17:34:28.671267] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:40:33.354 [2024-07-23 17:34:28.671280] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:40:33.354 00:40:33.354 real 0m0.339s 00:40:33.354 user 0m0.175s 00:40:33.354 sys 0m0.162s 00:40:33.354 17:34:28 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:40:33.354 17:34:28 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:33.354 17:34:28 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:40:33.354 ************************************ 00:40:33.354 END TEST bdev_json_nonarray 00:40:33.354 ************************************ 00:40:33.613 17:34:28 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # true 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:40:33.613 17:34:28 blockdev_crypto_aesni -- 
bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:40:33.613 17:34:28 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:40:33.613 00:40:33.613 real 1m14.865s 00:40:33.613 user 2m43.104s 00:40:33.613 sys 0m11.332s 00:40:33.613 17:34:28 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:33.613 17:34:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:40:33.613 ************************************ 00:40:33.613 END TEST blockdev_crypto_aesni 00:40:33.613 ************************************ 00:40:33.613 17:34:28 -- common/autotest_common.sh@1142 -- # return 0 00:40:33.613 17:34:28 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:40:33.613 17:34:28 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:40:33.613 17:34:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:33.613 17:34:28 -- common/autotest_common.sh@10 -- # set +x 00:40:33.613 ************************************ 00:40:33.613 START TEST blockdev_crypto_sw 00:40:33.613 ************************************ 00:40:33.613 17:34:28 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:40:33.613 * Looking for test storage... 
00:40:33.613 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:40:33.613 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:40:33.872 
17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=134027 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:40:33.872 17:34:29 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 134027 00:40:33.872 17:34:29 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 134027 ']' 00:40:33.872 17:34:29 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:40:33.872 17:34:29 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:40:33.872 17:34:29 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:40:33.872 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:40:33.872 17:34:29 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:40:33.872 17:34:29 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:33.872 [2024-07-23 17:34:29.111221] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:40:33.872 [2024-07-23 17:34:29.111298] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134027 ] 00:40:33.872 [2024-07-23 17:34:29.243584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:34.130 [2024-07-23 17:34:29.296234] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:34.697 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:40:34.697 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:40:34.697 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:40:34.697 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:40:34.697 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:40:34.697 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:40:34.697 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:34.955 Malloc0 00:40:34.955 Malloc1 00:40:34.955 true 00:40:34.955 true 00:40:34.955 true 00:40:34.955 [2024-07-23 17:34:30.291912] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:40:34.955 crypto_ram 00:40:34.955 [2024-07-23 17:34:30.299941] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:40:34.955 crypto_ram2 00:40:34.955 [2024-07-23 17:34:30.307957] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:40:34.955 crypto_ram3 00:40:34.955 [ 00:40:34.955 { 00:40:34.955 "name": "Malloc1", 00:40:34.955 "aliases": [ 00:40:34.955 "56bc9795-8ded-4796-ab5a-0bad328179d9" 00:40:34.955 ], 00:40:34.955 "product_name": "Malloc disk", 00:40:34.955 "block_size": 4096, 00:40:34.955 "num_blocks": 4096, 00:40:34.955 "uuid": "56bc9795-8ded-4796-ab5a-0bad328179d9", 
00:40:34.955 "assigned_rate_limits": { 00:40:34.955 "rw_ios_per_sec": 0, 00:40:34.955 "rw_mbytes_per_sec": 0, 00:40:34.955 "r_mbytes_per_sec": 0, 00:40:34.955 "w_mbytes_per_sec": 0 00:40:34.955 }, 00:40:34.955 "claimed": true, 00:40:34.955 "claim_type": "exclusive_write", 00:40:34.955 "zoned": false, 00:40:34.955 "supported_io_types": { 00:40:34.955 "read": true, 00:40:34.955 "write": true, 00:40:34.955 "unmap": true, 00:40:34.955 "flush": true, 00:40:34.955 "reset": true, 00:40:34.955 "nvme_admin": false, 00:40:34.955 "nvme_io": false, 00:40:34.955 "nvme_io_md": false, 00:40:34.955 "write_zeroes": true, 00:40:34.955 "zcopy": true, 00:40:34.955 "get_zone_info": false, 00:40:34.955 "zone_management": false, 00:40:34.955 "zone_append": false, 00:40:34.955 "compare": false, 00:40:34.955 "compare_and_write": false, 00:40:34.955 "abort": true, 00:40:34.955 "seek_hole": false, 00:40:34.955 "seek_data": false, 00:40:34.955 "copy": true, 00:40:34.955 "nvme_iov_md": false 00:40:34.955 }, 00:40:34.955 "memory_domains": [ 00:40:34.955 { 00:40:34.955 "dma_device_id": "system", 00:40:34.955 "dma_device_type": 1 00:40:34.955 }, 00:40:34.955 { 00:40:34.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:40:34.955 "dma_device_type": 2 00:40:34.955 } 00:40:34.955 ], 00:40:34.955 "driver_specific": {} 00:40:34.955 } 00:40:34.955 ] 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:40:34.955 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:40:34.955 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:40:34.955 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:40:34.955 17:34:30 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:40:34.955 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:40:34.955 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa30e7f5-7e0b-57af-9142-ccd71374903d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "aa30e7f5-7e0b-57af-9142-ccd71374903d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ac48648b-c39f-518c-b0b1-d8b4ddada2bd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ac48648b-c39f-518c-b0b1-d8b4ddada2bd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:40:35.214 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 134027 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 134027 ']' 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 134027 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 134027 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 134027' 00:40:35.214 killing process with pid 134027 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 134027 00:40:35.214 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 134027 00:40:35.782 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:40:35.782 17:34:30 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:40:35.782 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:40:35.782 17:34:30 
blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:35.782 17:34:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:35.782 ************************************ 00:40:35.782 START TEST bdev_hello_world 00:40:35.782 ************************************ 00:40:35.782 17:34:31 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:40:35.782 [2024-07-23 17:34:31.063899] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:40:35.782 [2024-07-23 17:34:31.063962] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134236 ] 00:40:35.782 [2024-07-23 17:34:31.193505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:36.041 [2024-07-23 17:34:31.247554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:36.041 [2024-07-23 17:34:31.426227] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:40:36.041 [2024-07-23 17:34:31.426296] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:40:36.041 [2024-07-23 17:34:31.426311] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:36.041 [2024-07-23 17:34:31.434245] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:40:36.041 [2024-07-23 17:34:31.434264] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:40:36.041 [2024-07-23 17:34:31.434275] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:36.041 
[2024-07-23 17:34:31.442267] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:40:36.041 [2024-07-23 17:34:31.442284] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:40:36.041 [2024-07-23 17:34:31.442296] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:36.300 [2024-07-23 17:34:31.483896] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:40:36.300 [2024-07-23 17:34:31.483932] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:40:36.300 [2024-07-23 17:34:31.483951] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:40:36.300 [2024-07-23 17:34:31.485205] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:40:36.300 [2024-07-23 17:34:31.485287] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:40:36.300 [2024-07-23 17:34:31.485303] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:40:36.300 [2024-07-23 17:34:31.485337] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:40:36.300 00:40:36.300 [2024-07-23 17:34:31.485354] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:40:36.300 00:40:36.300 real 0m0.691s 00:40:36.300 user 0m0.433s 00:40:36.300 sys 0m0.237s 00:40:36.300 17:34:31 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:36.300 17:34:31 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:40:36.300 ************************************ 00:40:36.300 END TEST bdev_hello_world 00:40:36.300 ************************************ 00:40:36.560 17:34:31 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:40:36.560 17:34:31 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:40:36.560 17:34:31 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:40:36.560 17:34:31 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:36.560 17:34:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:36.560 ************************************ 00:40:36.560 START TEST bdev_bounds 00:40:36.560 ************************************ 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=134420 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 134420' 00:40:36.560 Process bdevio pid: 134420 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # 
waitforlisten 134420 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 134420 ']' 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:40:36.560 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:40:36.560 17:34:31 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:40:36.560 [2024-07-23 17:34:31.845271] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:40:36.560 [2024-07-23 17:34:31.845344] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid134420 ] 00:40:36.560 [2024-07-23 17:34:31.980653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:40:36.819 [2024-07-23 17:34:32.037987] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:40:36.819 [2024-07-23 17:34:32.038078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:40:36.819 [2024-07-23 17:34:32.038080] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:36.819 [2024-07-23 17:34:32.207772] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:40:36.819 [2024-07-23 17:34:32.207841] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:40:36.819 [2024-07-23 17:34:32.207856] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:36.819 [2024-07-23 17:34:32.215791] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:40:36.819 [2024-07-23 17:34:32.215811] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:40:36.819 [2024-07-23 17:34:32.215822] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:36.819 [2024-07-23 17:34:32.223816] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:40:36.819 [2024-07-23 17:34:32.223835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:40:36.819 [2024-07-23 17:34:32.223846] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:37.387 17:34:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:40:37.387 17:34:32 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:40:37.387 17:34:32 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:40:37.647 I/O targets: 00:40:37.647 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:40:37.647 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:40:37.647 00:40:37.647 00:40:37.647 CUnit - A unit testing framework for C - Version 2.1-3 00:40:37.647 http://cunit.sourceforge.net/ 00:40:37.647 00:40:37.647 00:40:37.647 Suite: bdevio tests on: crypto_ram3 00:40:37.647 Test: blockdev write read block ...passed 00:40:37.647 Test: blockdev write zeroes read block ...passed 00:40:37.647 Test: blockdev write zeroes read no split ...passed 00:40:37.647 Test: blockdev write zeroes read split ...passed 00:40:37.647 Test: blockdev write zeroes read split partial ...passed 00:40:37.647 Test: blockdev reset ...passed 00:40:37.647 
Test: blockdev write read 8 blocks ...passed 00:40:37.647 Test: blockdev write read size > 128k ...passed 00:40:37.647 Test: blockdev write read invalid size ...passed 00:40:37.647 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:40:37.647 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:40:37.647 Test: blockdev write read max offset ...passed 00:40:37.647 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:40:37.647 Test: blockdev writev readv 8 blocks ...passed 00:40:37.647 Test: blockdev writev readv 30 x 1block ...passed 00:40:37.647 Test: blockdev writev readv block ...passed 00:40:37.647 Test: blockdev writev readv size > 128k ...passed 00:40:37.647 Test: blockdev writev readv size > 128k in two iovs ...passed 00:40:37.647 Test: blockdev comparev and writev ...passed 00:40:37.647 Test: blockdev nvme passthru rw ...passed 00:40:37.647 Test: blockdev nvme passthru vendor specific ...passed 00:40:37.647 Test: blockdev nvme admin passthru ...passed 00:40:37.647 Test: blockdev copy ...passed 00:40:37.647 Suite: bdevio tests on: crypto_ram 00:40:37.647 Test: blockdev write read block ...passed 00:40:37.647 Test: blockdev write zeroes read block ...passed 00:40:37.647 Test: blockdev write zeroes read no split ...passed 00:40:37.647 Test: blockdev write zeroes read split ...passed 00:40:37.647 Test: blockdev write zeroes read split partial ...passed 00:40:37.647 Test: blockdev reset ...passed 00:40:37.647 Test: blockdev write read 8 blocks ...passed 00:40:37.647 Test: blockdev write read size > 128k ...passed 00:40:37.647 Test: blockdev write read invalid size ...passed 00:40:37.647 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:40:37.647 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:40:37.647 Test: blockdev write read max offset ...passed 00:40:37.647 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:40:37.647 
Test: blockdev writev readv 8 blocks ...passed 00:40:37.647 Test: blockdev writev readv 30 x 1block ...passed 00:40:37.647 Test: blockdev writev readv block ...passed 00:40:37.647 Test: blockdev writev readv size > 128k ...passed 00:40:37.647 Test: blockdev writev readv size > 128k in two iovs ...passed 00:40:37.647 Test: blockdev comparev and writev ...passed 00:40:37.647 Test: blockdev nvme passthru rw ...passed 00:40:37.647 Test: blockdev nvme passthru vendor specific ...passed 00:40:37.648 Test: blockdev nvme admin passthru ...passed 00:40:37.648 Test: blockdev copy ...passed 00:40:37.648 00:40:37.648 Run Summary: Type Total Ran Passed Failed Inactive 00:40:37.648 suites 2 2 n/a 0 0 00:40:37.648 tests 46 46 46 0 0 00:40:37.648 asserts 260 260 260 0 n/a 00:40:37.648 00:40:37.648 Elapsed time = 0.194 seconds 00:40:37.648 0 00:40:37.648 17:34:33 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 134420 00:40:37.648 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 134420 ']' 00:40:37.648 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 134420 00:40:37.648 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:40:37.648 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:40:37.648 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 134420 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 134420' 00:40:37.907 killing process with pid 134420 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 134420 
00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 134420 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:40:37.907 00:40:37.907 real 0m1.518s 00:40:37.907 user 0m3.995s 00:40:37.907 sys 0m0.408s 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:37.907 17:34:33 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:40:37.907 ************************************ 00:40:37.907 END TEST bdev_bounds 00:40:37.907 ************************************ 00:40:38.166 17:34:33 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:40:38.166 17:34:33 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:40:38.166 17:34:33 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:40:38.166 17:34:33 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:38.166 17:34:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:38.166 ************************************ 00:40:38.166 START TEST bdev_nbd 00:40:38.166 ************************************ 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=134625 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 134625 /var/tmp/spdk-nbd.sock 
00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 134625 ']' 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:40:38.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:40:38.166 17:34:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:40:38.166 [2024-07-23 17:34:33.457729] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:40:38.166 [2024-07-23 17:34:33.457794] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:40:38.425 [2024-07-23 17:34:33.589867] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:40:38.425 [2024-07-23 17:34:33.642061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:40:38.425 [2024-07-23 17:34:33.805034] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:40:38.425 [2024-07-23 17:34:33.805100] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:40:38.425 [2024-07-23 17:34:33.805114] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:38.425 [2024-07-23 17:34:33.813053] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:40:38.425 [2024-07-23 17:34:33.813072] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:40:38.425 [2024-07-23 17:34:33.813084] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:38.425 [2024-07-23 17:34:33.821074] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:40:38.425 [2024-07-23 17:34:33.821093] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:40:38.425 [2024-07-23 17:34:33.821105] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 
00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:40:38.994 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:39.253 1+0 records in 00:40:39.253 1+0 records out 00:40:39.253 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266122 s, 15.4 MB/s 00:40:39.253 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:40:39.512 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:40:39.771 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:40:39.771 17:34:34 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:39.771 1+0 records in 00:40:39.771 1+0 records out 00:40:39.772 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000361746 s, 11.3 MB/s 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:40:39.772 17:34:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:40:40.031 { 00:40:40.031 "nbd_device": "/dev/nbd0", 00:40:40.031 "bdev_name": "crypto_ram" 00:40:40.031 }, 00:40:40.031 { 00:40:40.031 "nbd_device": "/dev/nbd1", 00:40:40.031 "bdev_name": "crypto_ram3" 00:40:40.031 } 00:40:40.031 ]' 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 
00:40:40.031 { 00:40:40.031 "nbd_device": "/dev/nbd0", 00:40:40.031 "bdev_name": "crypto_ram" 00:40:40.031 }, 00:40:40.031 { 00:40:40.031 "nbd_device": "/dev/nbd1", 00:40:40.031 "bdev_name": "crypto_ram3" 00:40:40.031 } 00:40:40.031 ]' 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:40.031 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:40:40.292 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:40.292 17:34:35 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:40.601 17:34:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # true 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 
00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:40:40.860 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:40:41.120 /dev/nbd0 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:41.120 1+0 records in 00:40:41.120 1+0 records out 00:40:41.120 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271486 s, 15.1 MB/s 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:40:41.120 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:40:41.379 /dev/nbd1 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:40:41.379 1+0 records in 00:40:41.379 1+0 records out 00:40:41.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000351958 s, 11.6 MB/s 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:41.379 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:40:41.639 { 00:40:41.639 "nbd_device": "/dev/nbd0", 00:40:41.639 "bdev_name": "crypto_ram" 00:40:41.639 }, 00:40:41.639 { 00:40:41.639 "nbd_device": "/dev/nbd1", 00:40:41.639 "bdev_name": "crypto_ram3" 00:40:41.639 } 00:40:41.639 ]' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:40:41.639 { 00:40:41.639 "nbd_device": "/dev/nbd0", 00:40:41.639 "bdev_name": "crypto_ram" 00:40:41.639 }, 
00:40:41.639 { 00:40:41.639 "nbd_device": "/dev/nbd1", 00:40:41.639 "bdev_name": "crypto_ram3" 00:40:41.639 } 00:40:41.639 ]' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:40:41.639 /dev/nbd1' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:40:41.639 /dev/nbd1' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:40:41.639 256+0 records in 00:40:41.639 256+0 records out 00:40:41.639 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106371 s, 98.6 MB/s 00:40:41.639 17:34:36 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:40:41.639 17:34:36 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:40:41.639 256+0 records in 00:40:41.639 256+0 records out 00:40:41.639 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0302864 s, 34.6 MB/s 00:40:41.639 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:40:41.639 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:40:41.899 256+0 records in 00:40:41.899 256+0 records out 00:40:41.899 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0472839 s, 22.2 MB/s 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:40:41.899 17:34:37 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:41.899 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:40:42.158 17:34:37 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:42.158 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:40:42.417 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:40:42.676 17:34:37 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:40:42.676 17:34:37 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:40:42.935 malloc_lvol_verify 00:40:42.935 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:40:43.195 f5fa850c-aed3-40f5-986e-3aa75e9a8b52 00:40:43.195 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 
00:40:43.195 c3e58a6e-3a92-4c9c-b214-7d4c8e143a3a 00:40:43.454 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:40:43.454 /dev/nbd0 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:40:43.713 mke2fs 1.46.5 (30-Dec-2021) 00:40:43.713 Discarding device blocks: 0/4096 done 00:40:43.713 Creating filesystem with 4096 1k blocks and 1024 inodes 00:40:43.713 00:40:43.713 Allocating group tables: 0/1 done 00:40:43.713 Writing inode tables: 0/1 done 00:40:43.713 Creating journal (1024 blocks): done 00:40:43.713 Writing superblocks and filesystem accounting information: 0/1 done 00:40:43.713 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:40:43.713 17:34:38 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 134625 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 134625 ']' 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 134625 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 134625 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 134625' 00:40:43.972 killing process with pid 134625 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 134625 00:40:43.972 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 134625 00:40:44.232 17:34:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT 
SIGTERM EXIT 00:40:44.232 00:40:44.232 real 0m6.043s 00:40:44.232 user 0m8.584s 00:40:44.232 sys 0m2.496s 00:40:44.232 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:44.232 17:34:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:40:44.232 ************************************ 00:40:44.232 END TEST bdev_nbd 00:40:44.232 ************************************ 00:40:44.232 17:34:39 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:40:44.232 17:34:39 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:40:44.232 17:34:39 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:40:44.232 17:34:39 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:40:44.232 17:34:39 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:40:44.232 17:34:39 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:40:44.232 17:34:39 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:44.232 17:34:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:40:44.232 ************************************ 00:40:44.232 START TEST bdev_fio 00:40:44.232 ************************************ 00:40:44.232 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:40:44.232 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:40:44.232 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:40:44.233 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 
00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:40:44.233 
************************************ 00:40:44.233 START TEST bdev_fio_rw_verify 00:40:44.233 ************************************ 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # 
for sanitizer in "${sanitizers[@]}" 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:40:44.233 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:40:44.492 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:40:44.492 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:40:44.492 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:40:44.493 17:34:39 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:44.752 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:40:44.752 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:40:44.752 fio-3.35 00:40:44.752 Starting 2 threads 00:40:56.967 00:40:56.967 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=135732: Tue Jul 23 17:34:50 2024 00:40:56.967 read: IOPS=26.4k, BW=103MiB/s (108MB/s)(1032MiB/10000msec) 00:40:56.967 slat (nsec): min=9096, max=91886, avg=17804.77, stdev=8384.70 00:40:56.967 clat (usec): min=4, max=506, avg=119.31, stdev=56.01 00:40:56.967 lat (usec): min=17, max=542, avg=137.11, stdev=60.75 00:40:56.967 clat percentiles (usec): 00:40:56.967 | 50.000th=[ 112], 99.000th=[ 281], 99.900th=[ 359], 99.990th=[ 429], 00:40:56.967 | 99.999th=[ 494] 00:40:56.967 write: IOPS=31.7k, BW=124MiB/s (130MB/s)(1175MiB/9477msec); 0 zone resets 00:40:56.967 slat (usec): min=9, max=845, avg=28.26, stdev=10.06 00:40:56.967 clat (usec): min=16, max=1196, avg=163.25, stdev=86.79 00:40:56.967 lat (usec): min=34, max=1244, avg=191.52, stdev=92.07 00:40:56.967 clat percentiles (usec): 00:40:56.967 | 50.000th=[ 151], 99.000th=[ 420], 99.900th=[ 523], 99.990th=[ 766], 00:40:56.967 | 99.999th=[ 1090] 00:40:56.967 bw ( KiB/s): min=80384, max=147536, per=93.96%, avg=119244.63, stdev=13744.83, samples=38 00:40:56.967 iops : min=20096, max=36884, avg=29811.16, stdev=3436.21, samples=38 00:40:56.968 lat (usec) : 10=0.01%, 20=0.01%, 50=6.51%, 100=26.64%, 250=57.44% 00:40:56.968 lat (usec) : 500=9.30%, 750=0.09%, 1000=0.01% 00:40:56.968 lat (msec) : 2=0.01% 00:40:56.968 cpu : usr=99.63%, sys=0.01%, ctx=41, majf=0, minf=606 00:40:56.968 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:40:56.968 
submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:40:56.968 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:40:56.968 issued rwts: total=264264,300684,0,0 short=0,0,0,0 dropped=0,0,0,0 00:40:56.968 latency : target=0, window=0, percentile=100.00%, depth=8 00:40:56.968 00:40:56.968 Run status group 0 (all jobs): 00:40:56.968 READ: bw=103MiB/s (108MB/s), 103MiB/s-103MiB/s (108MB/s-108MB/s), io=1032MiB (1082MB), run=10000-10000msec 00:40:56.968 WRITE: bw=124MiB/s (130MB/s), 124MiB/s-124MiB/s (130MB/s-130MB/s), io=1175MiB (1232MB), run=9477-9477msec 00:40:56.968 00:40:56.968 real 0m11.157s 00:40:56.968 user 0m23.558s 00:40:56.968 sys 0m0.399s 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:40:56.968 ************************************ 00:40:56.968 END TEST bdev_fio_rw_verify 00:40:56.968 ************************************ 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:40:56.968 17:34:50 
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa30e7f5-7e0b-57af-9142-ccd71374903d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "aa30e7f5-7e0b-57af-9142-ccd71374903d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ac48648b-c39f-518c-b0b1-d8b4ddada2bd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ac48648b-c39f-518c-b0b1-d8b4ddada2bd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:40:56.968 crypto_ram3 ]] 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa30e7f5-7e0b-57af-9142-ccd71374903d"' ' ],' ' "product_name": "crypto",' ' 
"block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "aa30e7f5-7e0b-57af-9142-ccd71374903d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ac48648b-c39f-518c-b0b1-d8b4ddada2bd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ac48648b-c39f-518c-b0b1-d8b4ddada2bd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' 
"crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:40:56.968 17:34:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:40:56.968 ************************************ 00:40:56.968 START TEST bdev_fio_trim 00:40:56.968 ************************************ 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 
-- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:40:56.968 17:34:51 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:40:56.968 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:40:56.969 17:34:51 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:40:56.969 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:40:56.969 
job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:40:56.969 fio-3.35 00:40:56.969 Starting 2 threads 00:41:06.957 00:41:06.957 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=137239: Tue Jul 23 17:35:01 2024 00:41:06.957 write: IOPS=38.6k, BW=151MiB/s (158MB/s)(1508MiB/10001msec); 0 zone resets 00:41:06.957 slat (usec): min=14, max=576, avg=22.23, stdev= 4.26 00:41:06.957 clat (usec): min=40, max=887, avg=170.29, stdev=94.67 00:41:06.957 lat (usec): min=55, max=922, avg=192.52, stdev=97.89 00:41:06.957 clat percentiles (usec): 00:41:06.957 | 50.000th=[ 135], 99.000th=[ 351], 99.900th=[ 371], 99.990th=[ 570], 00:41:06.957 | 99.999th=[ 775] 00:41:06.957 bw ( KiB/s): min=151080, max=156112, per=100.00%, avg=154591.58, stdev=630.11, samples=38 00:41:06.957 iops : min=37770, max=39028, avg=38647.89, stdev=157.53, samples=38 00:41:06.957 trim: IOPS=38.6k, BW=151MiB/s (158MB/s)(1508MiB/10001msec); 0 zone resets 00:41:06.957 slat (usec): min=6, max=1978, avg=10.88, stdev= 4.06 00:41:06.957 clat (usec): min=45, max=601, avg=113.12, stdev=34.06 00:41:06.957 lat (usec): min=54, max=2238, avg=124.00, stdev=34.35 00:41:06.957 clat percentiles (usec): 00:41:06.957 | 50.000th=[ 114], 99.000th=[ 184], 99.900th=[ 196], 99.990th=[ 289], 00:41:06.957 | 99.999th=[ 469] 00:41:06.957 bw ( KiB/s): min=151080, max=156112, per=100.00%, avg=154593.26, stdev=629.15, samples=38 00:41:06.957 iops : min=37770, max=39028, avg=38648.32, stdev=157.29, samples=38 00:41:06.957 lat (usec) : 50=3.24%, 100=32.39%, 250=49.73%, 500=14.64%, 750=0.01% 00:41:06.957 lat (usec) : 1000=0.01% 00:41:06.957 cpu : usr=99.55%, sys=0.00%, ctx=28, majf=0, minf=288 00:41:06.957 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:41:06.957 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:41:06.957 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:41:06.957 issued 
rwts: total=0,386123,386123,0 short=0,0,0,0 dropped=0,0,0,0 00:41:06.957 latency : target=0, window=0, percentile=100.00%, depth=8 00:41:06.957 00:41:06.957 Run status group 0 (all jobs): 00:41:06.957 WRITE: bw=151MiB/s (158MB/s), 151MiB/s-151MiB/s (158MB/s-158MB/s), io=1508MiB (1582MB), run=10001-10001msec 00:41:06.957 TRIM: bw=151MiB/s (158MB/s), 151MiB/s-151MiB/s (158MB/s-158MB/s), io=1508MiB (1582MB), run=10001-10001msec 00:41:06.957 00:41:06.957 real 0m11.205s 00:41:06.957 user 0m23.949s 00:41:06.957 sys 0m0.348s 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:41:06.957 ************************************ 00:41:06.957 END TEST bdev_fio_trim 00:41:06.957 ************************************ 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:41:06.957 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:41:06.957 00:41:06.957 real 0m22.758s 00:41:06.957 user 0m47.710s 00:41:06.957 sys 0m0.959s 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:41:06.957 ************************************ 00:41:06.957 END TEST bdev_fio 00:41:06.957 ************************************ 00:41:06.957 17:35:02 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:41:06.957 
17:35:02 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:41:06.957 17:35:02 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:41:06.957 17:35:02 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:41:06.957 17:35:02 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:41:06.957 17:35:02 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:41:06.957 ************************************ 00:41:06.957 START TEST bdev_verify 00:41:06.957 ************************************ 00:41:06.957 17:35:02 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:41:07.216 [2024-07-23 17:35:02.418850] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:41:07.216 [2024-07-23 17:35:02.418931] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138653 ] 00:41:07.216 [2024-07-23 17:35:02.551290] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:41:07.216 [2024-07-23 17:35:02.606950] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:41:07.216 [2024-07-23 17:35:02.606955] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:41:07.475 [2024-07-23 17:35:02.777742] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:41:07.475 [2024-07-23 17:35:02.777813] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:41:07.475 [2024-07-23 17:35:02.777829] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:07.475 [2024-07-23 17:35:02.785761] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:41:07.475 [2024-07-23 17:35:02.785781] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:41:07.475 [2024-07-23 17:35:02.785792] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:07.475 [2024-07-23 17:35:02.793786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:41:07.475 [2024-07-23 17:35:02.793805] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:41:07.475 [2024-07-23 17:35:02.793817] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:07.475 Running I/O for 5 seconds... 
00:41:12.747
00:41:12.747 Latency(us)
00:41:12.747 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:41:12.747 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:41:12.747 Verification LBA range: start 0x0 length 0x800
00:41:12.747 crypto_ram : 5.01 6103.47 23.84 0.00 0.00 20893.45 1702.51 22795.13
00:41:12.747 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:41:12.747 Verification LBA range: start 0x800 length 0x800
00:41:12.747 crypto_ram : 5.03 4913.47 19.19 0.00 0.00 25945.31 1923.34 26214.40
00:41:12.747 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:41:12.747 Verification LBA range: start 0x0 length 0x800
00:41:12.747 crypto_ram3 : 5.02 3060.14 11.95 0.00 0.00 41611.97 1937.59 27468.13
00:41:12.747 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:41:12.747 Verification LBA range: start 0x800 length 0x800
00:41:12.747 crypto_ram3 : 5.04 2465.54 9.63 0.00 0.00 51600.55 2222.53 32824.99
00:41:12.747 ===================================================================================================================
00:41:12.747 Total : 16542.63 64.62 0.00 0.00 30816.13 1702.51 32824.99
00:41:12.747
00:41:12.747 real 0m5.768s
00:41:12.747 user 0m10.868s
00:41:12.747 sys 0m0.251s
00:41:12.747 17:35:08 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:12.747 17:35:08 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:41:12.747 ************************************
00:41:12.747 END TEST bdev_verify
00:41:12.747 ************************************
00:41:13.055 17:35:08 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:41:13.055 17:35:08 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:41:13.055 17:35:08 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:41:13.055 17:35:08 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:41:13.055 17:35:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:41:13.055 ************************************
00:41:13.055 START TEST bdev_verify_big_io
00:41:13.055 ************************************
00:41:13.055 17:35:08 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:41:13.055 [2024-07-23 17:35:08.277050] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:41:13.055 [2024-07-23 17:35:08.277114] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139366 ]
00:41:13.055 [2024-07-23 17:35:08.412124] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:41:13.331 [2024-07-23 17:35:08.471543] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:41:13.331 [2024-07-23 17:35:08.471548] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:41:13.331 [2024-07-23 17:35:08.639786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:41:13.331 [2024-07-23 17:35:08.639856] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:41:13.331 [2024-07-23 17:35:08.639872] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:41:13.331 [2024-07-23 17:35:08.647808] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:41:13.331 [2024-07-23 17:35:08.647829] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:41:13.331 [2024-07-23 17:35:08.647841] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:41:13.331 [2024-07-23 17:35:08.655831] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:41:13.331 [2024-07-23 17:35:08.655852] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:41:13.331 [2024-07-23 17:35:08.655863] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:41:13.331 Running I/O for 5 seconds...
00:41:19.899
00:41:19.899 Latency(us)
00:41:19.899 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:41:19.899 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:41:19.899 Verification LBA range: start 0x0 length 0x80
00:41:19.899 crypto_ram : 5.22 465.84 29.12 0.00 0.00 268315.35 6354.14 381134.58
00:41:19.899 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:41:19.899 Verification LBA range: start 0x80 length 0x80
00:41:19.899 crypto_ram : 5.20 369.39 23.09 0.00 0.00 336992.43 7522.39 437666.50
00:41:19.899 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:41:19.899 Verification LBA range: start 0x0 length 0x80
00:41:19.899 crypto_ram3 : 5.23 244.55 15.28 0.00 0.00 494594.30 5698.78 397547.07
00:41:19.899 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:41:19.899 Verification LBA range: start 0x80 length 0x80
00:41:19.899 crypto_ram3 : 5.30 217.40 13.59 0.00 0.00 548263.96 7038.00 470491.49
00:41:19.899 ===================================================================================================================
00:41:19.899 Total : 1297.18 81.07 0.00 0.00 377984.84 5698.78 470491.49
00:41:19.899
00:41:19.899 real 0m6.033s
00:41:19.899 user 0m11.376s
00:41:19.899 sys 0m0.265s
00:41:19.899 17:35:14 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:19.899 17:35:14 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:41:19.899 ************************************
00:41:19.899 END TEST bdev_verify_big_io
00:41:19.899 ************************************
00:41:19.899 17:35:14 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:41:19.899 17:35:14 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:41:19.899 17:35:14 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:41:19.899 17:35:14 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:41:19.899 17:35:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:41:19.899 ************************************
00:41:19.899 START TEST bdev_write_zeroes
00:41:19.899 ************************************
00:41:19.899 17:35:14 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:41:19.899 [2024-07-23 17:35:14.398470] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:41:19.899 [2024-07-23 17:35:14.398534] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140092 ]
00:41:19.899 [2024-07-23 17:35:14.533244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:41:19.899 [2024-07-23 17:35:14.589990] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:41:19.899 [2024-07-23 17:35:14.770974] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:41:19.899 [2024-07-23 17:35:14.771051] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:41:19.899 [2024-07-23 17:35:14.771067] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:41:19.899 [2024-07-23 17:35:14.778991] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:41:19.899 [2024-07-23 17:35:14.779012] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:41:19.899 [2024-07-23 17:35:14.779023] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:41:19.899 [2024-07-23 17:35:14.787012] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:41:19.899 [2024-07-23 17:35:14.787032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:41:19.899 [2024-07-23 17:35:14.787045] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:41:19.899 Running I/O for 1 seconds...
00:41:20.468
00:41:20.468 Latency(us)
00:41:20.468 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:41:20.468 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:41:20.468 crypto_ram : 1.01 26404.87 103.14 0.00 0.00 4836.04 1303.60 6496.61
00:41:20.468 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:41:20.468 crypto_ram3 : 1.01 13175.72 51.47 0.00 0.00 9648.92 5983.72 9801.91
00:41:20.468 ===================================================================================================================
00:41:20.468 Total : 39580.59 154.61 0.00 0.00 6440.33 1303.60 9801.91
00:41:20.727
00:41:20.727 real 0m1.709s
00:41:20.727 user 0m1.439s
00:41:20.727 sys 0m0.241s
00:41:20.727 17:35:16 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:20.727 17:35:16 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:41:20.727 ************************************
00:41:20.727 END TEST bdev_write_zeroes
00:41:20.727 ************************************
00:41:20.727 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:41:20.727 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:41:20.727 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:41:20.727 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:41:20.727 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:41:20.727 ************************************
00:41:20.727 START TEST bdev_json_nonenclosed
00:41:20.727 ************************************
00:41:20.727 17:35:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:41:20.987 [2024-07-23 17:35:16.189412] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:41:20.987 [2024-07-23 17:35:16.189477] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140420 ]
00:41:20.987 [2024-07-23 17:35:16.320463] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:41:20.987 [2024-07-23 17:35:16.373540] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:41:20.987 [2024-07-23 17:35:16.373613] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:41:20.987 [2024-07-23 17:35:16.373631] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:41:20.987 [2024-07-23 17:35:16.373644] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:41:21.245
00:41:21.245 real 0m0.336s
00:41:21.245 user 0m0.181s
00:41:21.245 sys 0m0.152s
00:41:21.245 17:35:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:41:21.245 17:35:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:21.245 17:35:16 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:41:21.245 ************************************
00:41:21.245 END TEST bdev_json_nonenclosed
00:41:21.245 ************************************
00:41:21.245 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234
00:41:21.245 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # true
00:41:21.245 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:41:21.245 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:41:21.245 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:41:21.245 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:41:21.245 ************************************
00:41:21.245 START TEST bdev_json_nonarray
00:41:21.245 ************************************
00:41:21.245 17:35:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:41:21.245 [2024-07-23 17:35:16.617196] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:41:21.245 [2024-07-23 17:35:16.617282] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140473 ]
00:41:21.504 [2024-07-23 17:35:16.766018] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:41:21.504 [2024-07-23 17:35:16.823508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:41:21.504 [2024-07-23 17:35:16.823589] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
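The bdev_json_nonenclosed and bdev_json_nonarray runs in this section are negative tests: they feed bdevperf configs that parse as JSON but have the wrong shape (one not enclosed in {}, one whose "subsystems" member is not an array), and both are expected to fail with es=234. A simplified illustration of the shape checks being exercised (this is a sketch, not SPDK's actual json_config.c logic):

```python
import json

def config_shape_error(text):
    """Return a description of the config-shape problem, or None if OK.

    Simplified illustration of the checks the nonenclosed/nonarray
    negative tests exercise; not SPDK's actual implementation.
    """
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError:
        return "not valid JSON"
    if not isinstance(cfg, dict):
        return "not enclosed in {}"
    if not isinstance(cfg.get("subsystems"), list):
        return "'subsystems' should be an array"
    return None

print(config_shape_error('[{"subsystems": []}]'))  # not enclosed in {}
print(config_shape_error('{"subsystems": {}}'))    # 'subsystems' should be an array
print(config_shape_error('{"subsystems": []}'))    # None: shape is acceptable
```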
00:41:21.504 [2024-07-23 17:35:16.823607] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:41:21.504 [2024-07-23 17:35:16.823620] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:41:21.504
00:41:21.504 real 0m0.360s
00:41:21.504 user 0m0.181s
00:41:21.504 sys 0m0.177s
00:41:21.504 17:35:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:41:21.504 17:35:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:21.504 17:35:16 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:41:21.504 ************************************
00:41:21.504 END TEST bdev_json_nonarray
00:41:21.504 ************************************
00:41:21.763 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234
00:41:21.763 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # true
00:41:21.763 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]]
00:41:21.763 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]]
00:41:21.763 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]]
00:41:21.763 17:35:16 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem
00:41:21.763 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:41:21.763 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:41:21.763 17:35:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:41:21.763 ************************************
00:41:21.763 START TEST bdev_crypto_enomem
00:41:21.763 ************************************
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=140501
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f ''
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 140501
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 140501 ']'
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:41:21.763 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable
00:41:21.763 17:35:17 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:21.763 [2024-07-23 17:35:17.069822] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:41:21.763 [2024-07-23 17:35:17.069907] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140501 ]
00:41:22.023 [2024-07-23 17:35:17.209818] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:41:22.023 [2024-07-23 17:35:17.280698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:22.961 true
00:41:22.961 base0
00:41:22.961 true
00:41:22.961 [2024-07-23 17:35:18.051882] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:41:22.961 crypt0
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:41:22.961 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:22.961 [
00:41:22.961 {
00:41:22.961 "name": "crypt0",
00:41:22.961 "aliases": [
00:41:22.961 "337e54e8-5f5b-5f44-9fd9-2a50fbf99f90"
00:41:22.961 ],
00:41:22.961 "product_name": "crypto",
00:41:22.961 "block_size": 512,
00:41:22.961 "num_blocks": 2097152,
00:41:22.961 "uuid": "337e54e8-5f5b-5f44-9fd9-2a50fbf99f90",
00:41:22.961 "assigned_rate_limits": {
00:41:22.961 "rw_ios_per_sec": 0,
00:41:22.961 "rw_mbytes_per_sec": 0,
00:41:22.961 "r_mbytes_per_sec": 0,
00:41:22.961 "w_mbytes_per_sec": 0
00:41:22.961 },
00:41:22.961 "claimed": false,
00:41:22.961 "zoned": false,
00:41:22.961 "supported_io_types": {
00:41:22.961 "read": true,
00:41:22.961 "write": true,
00:41:22.961 "unmap": false,
00:41:22.961 "flush": false,
00:41:22.961 "reset": true,
00:41:22.961 "nvme_admin": false,
00:41:22.961 "nvme_io": false,
00:41:22.961 "nvme_io_md": false,
00:41:22.961 "write_zeroes": true,
00:41:22.961 "zcopy": false,
00:41:22.961 "get_zone_info": false,
00:41:22.961 "zone_management": false,
00:41:22.961 "zone_append": false,
00:41:22.961 "compare": false,
00:41:22.961 "compare_and_write": false,
00:41:22.961 "abort": false,
00:41:22.961 "seek_hole": false,
00:41:22.961 "seek_data": false,
00:41:22.961 "copy": false,
00:41:22.961 "nvme_iov_md": false
00:41:22.961 },
00:41:22.961 "memory_domains": [
00:41:22.961 {
00:41:22.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:41:22.961 "dma_device_type": 2
00:41:22.961 }
00:41:22.962 ],
00:41:22.962 "driver_specific": {
00:41:22.962 "crypto": {
00:41:22.962 "base_bdev_name": "EE_base0",
00:41:22.962 "name": "crypt0",
00:41:22.962 "key_name": "test_dek_sw"
00:41:22.962 }
00:41:22.962 }
00:41:22.962 }
00:41:22.962 ]
00:41:22.962 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:41:22.962 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0
00:41:22.962 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=140678
00:41:22.962 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1
00:41:22.962 17:35:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:41:22.962 Running I/O for 5 seconds...
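The bdev_get_bdevs output above describes the crypt0 vbdev: 512-byte blocks, 2097152 blocks, and a supported_io_types map where read, write, reset, and write_zeroes are true while unmap, flush, and the rest are false. A small sketch of how a client might interpret those fields (the dict below is a hand-reduced excerpt of the JSON above, not a live RPC call):

```python
# Hand-reduced excerpt of the crypt0 entry from the bdev_get_bdevs output above.
crypt0 = {
    "name": "crypt0",
    "block_size": 512,
    "num_blocks": 2097152,
    "supported_io_types": {
        "read": True, "write": True, "unmap": False, "flush": False,
        "reset": True, "write_zeroes": True,
    },
}

def capacity_mib(bdev):
    """Capacity in MiB: block_size * num_blocks / 2^20."""
    return bdev["block_size"] * bdev["num_blocks"] / (1 << 20)

print(capacity_mib(crypt0))                    # 1024.0 (512 B * 2097152 blocks = 1 GiB)
print(crypt0["supported_io_types"]["unmap"])   # False: unmap is reported unsupported
```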
00:41:23.900 17:35:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem
00:41:23.900 17:35:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:41:23.900 17:35:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:23.900 17:35:19 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:41:23.900 17:35:19 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 140678
00:41:28.091
00:41:28.091 Latency(us)
00:41:28.091 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:41:28.091 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096)
00:41:28.091 crypt0 : 5.00 27943.75 109.16 0.00 0.00 1140.50 555.63 2122.80
00:41:28.091 ===================================================================================================================
00:41:28.091 Total : 27943.75 109.16 0.00 0.00 1140.50 555.63 2122.80
00:41:28.091 0
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 140501
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 140501 ']'
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 140501
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 140501
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 140501'
00:41:28.091 killing process with pid 140501
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 140501
00:41:28.091 Received shutdown signal, test time was about 5.000000 seconds
00:41:28.091
00:41:28.091 Latency(us)
00:41:28.091 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:41:28.091 ===================================================================================================================
00:41:28.091 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:41:28.091 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 140501
00:41:28.349 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT
00:41:28.349
00:41:28.349 real 0m6.552s
00:41:28.349 user 0m6.769s
00:41:28.349 sys 0m0.444s
00:41:28.349 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:28.349 17:35:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:41:28.349 ************************************
00:41:28.349 END TEST bdev_crypto_enomem
00:41:28.349 ************************************
00:41:28.349 17:35:23 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]]
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]]
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]]
00:41:28.349 17:35:23 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]]
00:41:28.349
00:41:28.349 real 0m54.708s
00:41:28.349 user 1m34.044s
00:41:28.349 sys 0m6.894s
00:41:28.349 17:35:23 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable
00:41:28.349 17:35:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:41:28.349 ************************************
00:41:28.349 END TEST blockdev_crypto_sw
00:41:28.349 ************************************
00:41:28.349 17:35:23 -- common/autotest_common.sh@1142 -- # return 0
00:41:28.349 17:35:23 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat
00:41:28.349 17:35:23 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:41:28.349 17:35:23 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:41:28.349 17:35:23 -- common/autotest_common.sh@10 -- # set +x
00:41:28.349 ************************************
00:41:28.349 START TEST blockdev_crypto_qat
00:41:28.349 ************************************
00:41:28.349 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat
00:41:28.608 * Looking for test storage...
00:41:28.608 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # :
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']'
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device=
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek=
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # env_ctx=
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc=
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']'
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]]
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]]
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=141435
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 141435
00:41:28.608 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 141435 ']'
00:41:28.608 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:41:28.608 17:35:23 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc
00:41:28.608 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100
00:41:28.608 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:41:28.608 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:41:28.608 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable
00:41:28.608 17:35:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:41:28.608 [2024-07-23 17:35:23.896963] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:41:28.608 [2024-07-23 17:35:23.897045] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid141435 ]
00:41:28.608 [2024-07-23 17:35:24.028486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:41:28.866 [2024-07-23 17:35:24.079702] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:41:29.433 17:35:24 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:41:29.433 17:35:24 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0
00:41:29.433 17:35:24 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in
00:41:29.433 17:35:24 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf
00:41:29.433 17:35:24 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd
00:41:29.433 17:35:24 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable
00:41:29.433 17:35:24 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:41:29.433 [2024-07-23 17:35:24.854113] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:41:29.691 [2024-07-23 17:35:24.862148] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:41:29.691 [2024-07-23 17:35:24.870167] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:41:29.691 [2024-07-23 17:35:24.939391] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:41:32.224 true
00:41:32.224 true
00:41:32.224 true
00:41:32.224 true
00:41:32.224 Malloc0
00:41:32.224 Malloc1
00:41:32.224 Malloc2
00:41:32.224 Malloc3
00:41:32.224 [2024-07-23 17:35:27.505466] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:41:32.224 crypto_ram 00:41:32.224 [2024-07-23 17:35:27.513481] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:41:32.224 crypto_ram1 00:41:32.224 [2024-07-23 17:35:27.521503] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:41:32.224 crypto_ram2 00:41:32.224 [2024-07-23 17:35:27.529523] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:41:32.224 crypto_ram3 00:41:32.224 [ 00:41:32.224 { 00:41:32.224 "name": "Malloc1", 00:41:32.224 "aliases": [ 00:41:32.224 "e067be1c-2e5f-4b13-9561-1caa52541ee7" 00:41:32.224 ], 00:41:32.224 "product_name": "Malloc disk", 00:41:32.224 "block_size": 512, 00:41:32.224 "num_blocks": 65536, 00:41:32.224 "uuid": "e067be1c-2e5f-4b13-9561-1caa52541ee7", 00:41:32.224 "assigned_rate_limits": { 00:41:32.224 "rw_ios_per_sec": 0, 00:41:32.224 "rw_mbytes_per_sec": 0, 00:41:32.224 "r_mbytes_per_sec": 0, 00:41:32.224 "w_mbytes_per_sec": 0 00:41:32.224 }, 00:41:32.224 "claimed": true, 00:41:32.224 "claim_type": "exclusive_write", 00:41:32.224 "zoned": false, 00:41:32.224 "supported_io_types": { 00:41:32.224 "read": true, 00:41:32.224 "write": true, 00:41:32.224 "unmap": true, 00:41:32.224 "flush": true, 00:41:32.224 "reset": true, 00:41:32.224 "nvme_admin": false, 00:41:32.224 "nvme_io": false, 00:41:32.224 "nvme_io_md": false, 00:41:32.224 "write_zeroes": true, 00:41:32.224 "zcopy": true, 00:41:32.224 "get_zone_info": false, 00:41:32.224 "zone_management": false, 00:41:32.224 "zone_append": false, 00:41:32.224 "compare": false, 00:41:32.224 "compare_and_write": false, 00:41:32.224 "abort": true, 00:41:32.224 "seek_hole": false, 00:41:32.224 "seek_data": false, 00:41:32.224 "copy": true, 00:41:32.224 "nvme_iov_md": false 00:41:32.224 }, 00:41:32.224 "memory_domains": [ 00:41:32.224 { 00:41:32.224 "dma_device_id": "system", 00:41:32.224 "dma_device_type": 1 00:41:32.224 }, 00:41:32.224 { 00:41:32.224 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:41:32.224 "dma_device_type": 2 00:41:32.224 } 00:41:32.224 ], 00:41:32.224 "driver_specific": {} 00:41:32.224 } 00:41:32.224 ] 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:41:32.224 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:41:32.224 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:41:32.224 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:32.224 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:41:32.224 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:41:32.225 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:41:32.225 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:32.225 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:41:32.225 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:41:32.225 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:41:32.225 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:41:32.484 17:35:27 blockdev_crypto_qat -- 
bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d4d4fe1e-f9f9-54ef-8b60-fa61cd0bcbdf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d4d4fe1e-f9f9-54ef-8b60-fa61cd0bcbdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "7f608c1f-7a00-53cc-bbc9-9c1a59c22772"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"7f608c1f-7a00-53cc-bbc9-9c1a59c22772",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2dbf549b-4152-5650-b11b-66f4beace959"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2dbf549b-4152-5650-b11b-66f4beace959",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f5c17bd1-0839-5070-8687-3211f94e1b80"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f5c17bd1-0839-5070-8687-3211f94e1b80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:41:32.484 17:35:27 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 141435 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 141435 ']' 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 141435 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:41:32.484 17:35:27 
blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 141435 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 141435' 00:41:32.484 killing process with pid 141435 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 141435 00:41:32.484 17:35:27 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 141435 00:41:33.050 17:35:28 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:41:33.050 17:35:28 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:41:33.050 17:35:28 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:41:33.050 17:35:28 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:41:33.050 17:35:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:33.050 ************************************ 00:41:33.050 START TEST bdev_hello_world 00:41:33.050 ************************************ 00:41:33.050 17:35:28 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:41:33.309 [2024-07-23 17:35:28.490016] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:41:33.309 [2024-07-23 17:35:28.490081] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid141985 ] 00:41:33.309 [2024-07-23 17:35:28.621780] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:33.309 [2024-07-23 17:35:28.673807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:41:33.309 [2024-07-23 17:35:28.695188] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:41:33.309 [2024-07-23 17:35:28.703216] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:41:33.309 [2024-07-23 17:35:28.711232] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:41:33.568 [2024-07-23 17:35:28.827556] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:41:36.136 [2024-07-23 17:35:31.222653] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:41:36.136 [2024-07-23 17:35:31.222724] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:41:36.136 [2024-07-23 17:35:31.222739] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:36.136 [2024-07-23 17:35:31.230671] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:41:36.136 [2024-07-23 17:35:31.230691] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:41:36.136 [2024-07-23 17:35:31.230702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:36.136 [2024-07-23 17:35:31.238691] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:41:36.136 [2024-07-23 17:35:31.238711] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:41:36.136 [2024-07-23 17:35:31.238722] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:36.136 [2024-07-23 17:35:31.246712] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:41:36.136 [2024-07-23 17:35:31.246731] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:41:36.136 [2024-07-23 17:35:31.246742] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:36.136 [2024-07-23 17:35:31.324770] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:41:36.136 [2024-07-23 17:35:31.324817] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:41:36.136 [2024-07-23 17:35:31.324836] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:41:36.136 [2024-07-23 17:35:31.326294] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:41:36.136 [2024-07-23 17:35:31.326373] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:41:36.136 [2024-07-23 17:35:31.326391] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:41:36.136 [2024-07-23 17:35:31.326435] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:41:36.136 00:41:36.136 [2024-07-23 17:35:31.326454] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:41:36.396 00:41:36.396 real 0m3.285s 00:41:36.396 user 0m2.678s 00:41:36.396 sys 0m0.564s 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:41:36.396 ************************************ 00:41:36.396 END TEST bdev_hello_world 00:41:36.396 ************************************ 00:41:36.396 17:35:31 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:41:36.396 17:35:31 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:41:36.396 17:35:31 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:41:36.396 17:35:31 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:41:36.396 17:35:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:36.396 ************************************ 00:41:36.396 START TEST bdev_bounds 00:41:36.396 ************************************ 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=142516 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 142516' 00:41:36.396 Process bdevio pid: 142516 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@292 -- # waitforlisten 142516 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 142516 ']' 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:41:36.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:41:36.396 17:35:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:41:36.656 [2024-07-23 17:35:31.870389] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:41:36.656 [2024-07-23 17:35:31.870459] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid142516 ] 00:41:36.656 [2024-07-23 17:35:32.002327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:41:36.656 [2024-07-23 17:35:32.060738] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:41:36.656 [2024-07-23 17:35:32.060837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:41:36.656 [2024-07-23 17:35:32.060838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:41:36.915 [2024-07-23 17:35:32.082367] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:41:36.915 [2024-07-23 17:35:32.090399] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:41:36.915 [2024-07-23 17:35:32.098419] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:41:36.915 [2024-07-23 17:35:32.211973] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:41:39.450 [2024-07-23 17:35:34.585474] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:41:39.450 [2024-07-23 17:35:34.585536] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:41:39.450 [2024-07-23 17:35:34.585551] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:39.450 [2024-07-23 17:35:34.593490] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:41:39.450 [2024-07-23 17:35:34.593511] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:41:39.450 [2024-07-23 17:35:34.593523] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:39.450 [2024-07-23 17:35:34.601519] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:41:39.450 [2024-07-23 17:35:34.601538] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:41:39.450 [2024-07-23 17:35:34.601550] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:39.450 [2024-07-23 17:35:34.609540] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:41:39.450 [2024-07-23 17:35:34.609559] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:41:39.450 [2024-07-23 17:35:34.609570] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:39.450 17:35:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:41:39.450 17:35:34 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:41:39.450 17:35:34 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:41:39.450 I/O targets: 00:41:39.450 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:41:39.450 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:41:39.450 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:41:39.450 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:41:39.450 00:41:39.450 00:41:39.450 CUnit - A unit testing framework for C - Version 2.1-3 00:41:39.450 http://cunit.sourceforge.net/ 00:41:39.450 00:41:39.450 00:41:39.450 Suite: bdevio tests on: crypto_ram3 00:41:39.450 Test: blockdev write read block ...passed 00:41:39.450 Test: blockdev write zeroes read block ...passed 00:41:39.450 Test: blockdev write zeroes read no split ...passed 00:41:39.450 Test: blockdev write zeroes read split 
...passed 00:41:39.709 Test: blockdev write zeroes read split partial ...passed 00:41:39.709 Test: blockdev reset ...passed 00:41:39.709 Test: blockdev write read 8 blocks ...passed 00:41:39.709 Test: blockdev write read size > 128k ...passed 00:41:39.709 Test: blockdev write read invalid size ...passed 00:41:39.709 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:41:39.709 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:41:39.709 Test: blockdev write read max offset ...passed 00:41:39.709 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:41:39.709 Test: blockdev writev readv 8 blocks ...passed 00:41:39.709 Test: blockdev writev readv 30 x 1block ...passed 00:41:39.709 Test: blockdev writev readv block ...passed 00:41:39.709 Test: blockdev writev readv size > 128k ...passed 00:41:39.709 Test: blockdev writev readv size > 128k in two iovs ...passed 00:41:39.709 Test: blockdev comparev and writev ...passed 00:41:39.709 Test: blockdev nvme passthru rw ...passed 00:41:39.709 Test: blockdev nvme passthru vendor specific ...passed 00:41:39.709 Test: blockdev nvme admin passthru ...passed 00:41:39.709 Test: blockdev copy ...passed 00:41:39.709 Suite: bdevio tests on: crypto_ram2 00:41:39.709 Test: blockdev write read block ...passed 00:41:39.709 Test: blockdev write zeroes read block ...passed 00:41:39.709 Test: blockdev write zeroes read no split ...passed 00:41:39.709 Test: blockdev write zeroes read split ...passed 00:41:39.709 Test: blockdev write zeroes read split partial ...passed 00:41:39.709 Test: blockdev reset ...passed 00:41:39.709 Test: blockdev write read 8 blocks ...passed 00:41:39.709 Test: blockdev write read size > 128k ...passed 00:41:39.709 Test: blockdev write read invalid size ...passed 00:41:39.709 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:41:39.709 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:41:39.709 Test: 
blockdev write read max offset ...passed 00:41:39.709 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:41:39.709 Test: blockdev writev readv 8 blocks ...passed 00:41:39.709 Test: blockdev writev readv 30 x 1block ...passed 00:41:39.709 Test: blockdev writev readv block ...passed 00:41:39.709 Test: blockdev writev readv size > 128k ...passed 00:41:39.709 Test: blockdev writev readv size > 128k in two iovs ...passed 00:41:39.709 Test: blockdev comparev and writev ...passed 00:41:39.709 Test: blockdev nvme passthru rw ...passed 00:41:39.709 Test: blockdev nvme passthru vendor specific ...passed 00:41:39.709 Test: blockdev nvme admin passthru ...passed 00:41:39.709 Test: blockdev copy ...passed 00:41:39.709 Suite: bdevio tests on: crypto_ram1 00:41:39.709 Test: blockdev write read block ...passed 00:41:39.709 Test: blockdev write zeroes read block ...passed 00:41:39.709 Test: blockdev write zeroes read no split ...passed 00:41:39.709 Test: blockdev write zeroes read split ...passed 00:41:39.968 Test: blockdev write zeroes read split partial ...passed 00:41:39.968 Test: blockdev reset ...passed 00:41:39.968 Test: blockdev write read 8 blocks ...passed 00:41:39.968 Test: blockdev write read size > 128k ...passed 00:41:39.968 Test: blockdev write read invalid size ...passed 00:41:39.968 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:41:39.968 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:41:39.968 Test: blockdev write read max offset ...passed 00:41:39.968 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:41:39.968 Test: blockdev writev readv 8 blocks ...passed 00:41:39.968 Test: blockdev writev readv 30 x 1block ...passed 00:41:39.968 Test: blockdev writev readv block ...passed 00:41:39.968 Test: blockdev writev readv size > 128k ...passed 00:41:39.968 Test: blockdev writev readv size > 128k in two iovs ...passed 00:41:39.968 Test: blockdev comparev and writev 
...passed 00:41:39.968 Test: blockdev nvme passthru rw ...passed 00:41:39.968 Test: blockdev nvme passthru vendor specific ...passed 00:41:39.968 Test: blockdev nvme admin passthru ...passed 00:41:39.968 Test: blockdev copy ...passed 00:41:39.968 Suite: bdevio tests on: crypto_ram 00:41:39.968 Test: blockdev write read block ...passed 00:41:39.968 Test: blockdev write zeroes read block ...passed 00:41:39.968 Test: blockdev write zeroes read no split ...passed 00:41:40.227 Test: blockdev write zeroes read split ...passed 00:41:40.227 Test: blockdev write zeroes read split partial ...passed 00:41:40.227 Test: blockdev reset ...passed 00:41:40.227 Test: blockdev write read 8 blocks ...passed 00:41:40.227 Test: blockdev write read size > 128k ...passed 00:41:40.227 Test: blockdev write read invalid size ...passed 00:41:40.227 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:41:40.227 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:41:40.227 Test: blockdev write read max offset ...passed 00:41:40.227 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:41:40.227 Test: blockdev writev readv 8 blocks ...passed 00:41:40.227 Test: blockdev writev readv 30 x 1block ...passed 00:41:40.227 Test: blockdev writev readv block ...passed 00:41:40.227 Test: blockdev writev readv size > 128k ...passed 00:41:40.227 Test: blockdev writev readv size > 128k in two iovs ...passed 00:41:40.227 Test: blockdev comparev and writev ...passed 00:41:40.227 Test: blockdev nvme passthru rw ...passed 00:41:40.227 Test: blockdev nvme passthru vendor specific ...passed 00:41:40.227 Test: blockdev nvme admin passthru ...passed 00:41:40.227 Test: blockdev copy ...passed 00:41:40.227 00:41:40.227 Run Summary: Type Total Ran Passed Failed Inactive 00:41:40.227 suites 4 4 n/a 0 0 00:41:40.227 tests 92 92 92 0 0 00:41:40.227 asserts 520 520 520 0 n/a 00:41:40.227 00:41:40.227 Elapsed time = 1.487 seconds 00:41:40.227 0 00:41:40.227 
17:35:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 142516 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 142516 ']' 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 142516 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 142516 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 142516' 00:41:40.227 killing process with pid 142516 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 142516 00:41:40.227 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 142516 00:41:40.796 17:35:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:41:40.796 00:41:40.796 real 0m4.181s 00:41:40.796 user 0m11.219s 00:41:40.796 sys 0m0.748s 00:41:40.796 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:41:40.796 17:35:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:41:40.796 ************************************ 00:41:40.796 END TEST bdev_bounds 00:41:40.796 ************************************ 00:41:40.796 17:35:36 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:41:40.796 17:35:36 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:41:40.796 17:35:36 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:41:40.796 17:35:36 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:41:40.796 17:35:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:40.796 ************************************ 00:41:40.796 START TEST bdev_nbd 00:41:40.796 ************************************ 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:41:40.796 17:35:36 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=143071 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 143071 /var/tmp/spdk-nbd.sock 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 143071 ']' 00:41:40.796 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:41:40.797 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:41:40.797 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:41:40.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:41:40.797 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:41:40.797 17:35:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:41:40.797 [2024-07-23 17:35:36.146889] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:41:40.797 [2024-07-23 17:35:36.146963] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:41:41.056 [2024-07-23 17:35:36.272149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:41:41.056 [2024-07-23 17:35:36.327012] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:41:41.056 [2024-07-23 17:35:36.348380] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:41:41.056 [2024-07-23 17:35:36.356410] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:41:41.056 [2024-07-23 17:35:36.364427] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:41:41.056 [2024-07-23 17:35:36.470269] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:41:43.593 [2024-07-23 17:35:38.867605] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:41:43.593 [2024-07-23 17:35:38.867671] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:41:43.593 [2024-07-23 17:35:38.867690] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:43.593 [2024-07-23 17:35:38.875625] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:41:43.593 [2024-07-23 17:35:38.875645] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:41:43.593 [2024-07-23 17:35:38.875657] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:43.593 [2024-07-23 17:35:38.883646] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:41:43.593 [2024-07-23 17:35:38.883665] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:41:43.593 [2024-07-23 17:35:38.883678] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:43.593 [2024-07-23 17:35:38.891665] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:41:43.593 [2024-07-23 17:35:38.891692] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:41:43.593 [2024-07-23 17:35:38.891704] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:41:43.593 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:41:43.853 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:44.112 
17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:44.112 1+0 records in 00:41:44.112 1+0 records out 00:41:44.112 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312957 s, 13.1 MB/s 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:41:44.112 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:44.372 1+0 records in 00:41:44.372 1+0 records out 00:41:44.372 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000326658 s, 12.5 MB/s 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:41:44.372 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:44.631 1+0 records in 00:41:44.631 1+0 records out 00:41:44.631 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328327 s, 12.5 MB/s 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:41:44.631 17:35:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:44.891 1+0 records in 00:41:44.891 1+0 records out 00:41:44.891 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000365338 s, 11.2 MB/s 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:41:44.891 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd0", 00:41:45.150 "bdev_name": "crypto_ram" 00:41:45.150 }, 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd1", 00:41:45.150 "bdev_name": "crypto_ram1" 00:41:45.150 }, 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd2", 00:41:45.150 "bdev_name": "crypto_ram2" 00:41:45.150 }, 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd3", 00:41:45.150 "bdev_name": "crypto_ram3" 00:41:45.150 } 00:41:45.150 ]' 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd0", 00:41:45.150 "bdev_name": "crypto_ram" 00:41:45.150 }, 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd1", 00:41:45.150 "bdev_name": "crypto_ram1" 00:41:45.150 }, 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd2", 00:41:45.150 "bdev_name": "crypto_ram2" 00:41:45.150 }, 00:41:45.150 { 00:41:45.150 "nbd_device": "/dev/nbd3", 00:41:45.150 "bdev_name": 
"crypto_ram3" 00:41:45.150 } 00:41:45.150 ]' 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:45.150 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:45.410 17:35:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:45.410 17:35:40 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:45.670 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:45.928 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:46.188 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:41:46.447 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:41:46.706 17:35:41 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:41:46.706 17:35:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:41:46.706 /dev/nbd0 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:46.965 1+0 records in 00:41:46.965 1+0 records out 00:41:46.965 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312997 s, 13.1 MB/s 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:41:46.965 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:41:47.224 /dev/nbd1 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:47.224 1+0 records in 00:41:47.224 1+0 records out 00:41:47.224 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000349452 s, 11.7 MB/s 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:41:47.224 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:41:47.484 /dev/nbd10 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:47.484 1+0 records in 00:41:47.484 1+0 records out 00:41:47.484 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276535 s, 14.8 MB/s 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:41:47.484 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:41:47.743 /dev/nbd11 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:41:47.743 1+0 records in 00:41:47.743 1+0 records out 00:41:47.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000376091 s, 10.9 MB/s 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:47.743 17:35:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd0", 00:41:48.003 "bdev_name": "crypto_ram" 00:41:48.003 }, 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd1", 00:41:48.003 "bdev_name": "crypto_ram1" 00:41:48.003 }, 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd10", 00:41:48.003 "bdev_name": "crypto_ram2" 00:41:48.003 }, 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd11", 00:41:48.003 "bdev_name": "crypto_ram3" 00:41:48.003 } 00:41:48.003 ]' 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd0", 00:41:48.003 "bdev_name": "crypto_ram" 00:41:48.003 }, 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd1", 00:41:48.003 "bdev_name": "crypto_ram1" 00:41:48.003 }, 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd10", 00:41:48.003 "bdev_name": "crypto_ram2" 00:41:48.003 }, 00:41:48.003 { 00:41:48.003 "nbd_device": "/dev/nbd11", 00:41:48.003 "bdev_name": "crypto_ram3" 00:41:48.003 } 00:41:48.003 ]' 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:41:48.003 17:35:43 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:41:48.003 /dev/nbd1 00:41:48.003 /dev/nbd10 00:41:48.003 /dev/nbd11' 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:41:48.003 /dev/nbd1 00:41:48.003 /dev/nbd10 00:41:48.003 /dev/nbd11' 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:41:48.003 256+0 records in 00:41:48.003 256+0 records out 00:41:48.003 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109406 s, 95.8 MB/s 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:41:48.003 
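The `nbd_get_disks` / `jq` step traced above extracts one device path per line from the RPC's JSON reply and then counts the `/dev/nbd` entries. A minimal standalone sketch of that extraction, using a hypothetical two-entry JSON sample shaped like the reply in the log (assumes `jq` is installed; not the actual RPC output):

```shell
#!/bin/sh
# Hypothetical sample shaped like the nbd_get_disks reply above
json='[{"nbd_device":"/dev/nbd0","bdev_name":"crypto_ram"},
       {"nbd_device":"/dev/nbd1","bdev_name":"crypto_ram1"}]'

# One device path per line, as nbd_common.sh@64 does with jq
names=$(echo "$json" | jq -r '.[] | .nbd_device')

# Count the /dev/nbd entries, as nbd_common.sh@65 does
count=$(echo "$names" | grep -c /dev/nbd)
echo "$count"   # prints 2 for this two-entry sample
```

In the traced run the same pipeline yields `count=4`, which the script then checks against the expected device count with `'[' 4 -ne 4 ']'`.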
17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:41:48.003 256+0 records in 00:41:48.003 256+0 records out 00:41:48.003 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0827652 s, 12.7 MB/s 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:41:48.003 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:41:48.261 256+0 records in 00:41:48.261 256+0 records out 00:41:48.261 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.065196 s, 16.1 MB/s 00:41:48.261 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:41:48.261 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:41:48.261 256+0 records in 00:41:48.261 256+0 records out 00:41:48.261 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0581239 s, 18.0 MB/s 00:41:48.261 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:41:48.261 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:41:48.261 256+0 records in 00:41:48.261 256+0 records out 00:41:48.261 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0552737 s, 19.0 MB/s 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:48.262 17:35:43 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
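The write/verify pass above is plain coreutils: fill a temp file with 256 random 4 KiB blocks, `dd` it onto each device with `oflag=direct`, then `cmp -b -n 1M` the data back. A self-contained sketch of the same pattern, substituting ordinary temp files for `/dev/nbd*` (and dropping `oflag=direct`, which regular files on most filesystems may reject) so it runs anywhere:

```shell
#!/bin/sh
set -e
tmp=$(mktemp)   # stands in for .../test/bdev/nbdrandtest
dev=$(mktemp)   # stands in for one /dev/nbdX device

# Write phase: 256 random 4 KiB blocks (exactly 1 MiB), then copy to the "device"
dd if=/dev/urandom of="$tmp" bs=4096 count=256 2>/dev/null
dd if="$tmp" of="$dev" bs=4096 count=256 2>/dev/null

# Verify phase: byte-compare the first 1M, as nbd_common.sh@83 does
cmp -b -n 1M "$tmp" "$dev" && echo verified

rm -f "$tmp" "$dev"
```

Against a real nbd device the `cmp` also exercises the crypto bdev's read path, since the data read back must decrypt to the original random pattern.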
00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:48.262 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:48.520 17:35:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:48.778 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:48.779 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:49.037 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:41:49.296 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:49.555 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:41:49.814 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:41:49.814 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:41:49.814 17:35:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:41:49.814 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:41:50.073 malloc_lvol_verify 00:41:50.073 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:41:50.332 15377ac0-7ad5-482d-83ea-c2593d778cb0 00:41:50.332 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:41:50.591 4650c68e-75b1-4986-a0cc-d28aff18ed49 00:41:50.591 17:35:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:41:50.850 /dev/nbd0 
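After the disks are stopped, the same jq/grep pipeline runs against an empty list. `grep -c` still prints `0` but exits non-zero when nothing matches, which is why the trace lands on `nbd_common.sh@65 -- # true` instead of aborting under `set -e`. A minimal sketch of that guard:

```shell
#!/bin/sh
set -e
names=''   # nbd_get_disks returned '[]', so no device names remain

# grep -c prints the match count but exits 1 on zero matches;
# '|| true' keeps 'set -e' from killing the script here
count=$(echo "$names" | grep -c /dev/nbd || true)
echo "count=$count"   # prints count=0
```

The captured `count=0` is then compared with `'[' 0 -ne 0 ']'` to confirm every nbd device was actually torn down.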
00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:41:50.850 mke2fs 1.46.5 (30-Dec-2021) 00:41:50.850 Discarding device blocks: 0/4096 done 00:41:50.850 Creating filesystem with 4096 1k blocks and 1024 inodes 00:41:50.850 00:41:50.850 Allocating group tables: 0/1 done 00:41:50.850 Writing inode tables: 0/1 done 00:41:50.850 Creating journal (1024 blocks): done 00:41:50.850 Writing superblocks and filesystem accounting information: 0/1 done 00:41:50.850 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:41:50.850 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 143071 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 143071 ']' 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 143071 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 143071 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 143071' 00:41:51.109 killing process with pid 143071 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 143071 00:41:51.109 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 143071 00:41:51.398 17:35:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:41:51.398 00:41:51.398 real 0m10.661s 00:41:51.398 user 0m13.762s 00:41:51.398 sys 0m4.301s 00:41:51.398 17:35:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:41:51.398 17:35:46 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:41:51.398 ************************************ 00:41:51.398 END TEST bdev_nbd 00:41:51.398 ************************************ 00:41:51.398 17:35:46 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:41:51.398 17:35:46 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:41:51.398 17:35:46 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:41:51.398 17:35:46 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:41:51.398 17:35:46 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:41:51.398 17:35:46 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:41:51.398 17:35:46 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:41:51.398 17:35:46 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:41:51.657 ************************************ 00:41:51.657 START TEST bdev_fio 00:41:51.657 ************************************ 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:41:51.657 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:41:51.657 
17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:41:51.657 
17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:41:51.657 ************************************ 00:41:51.657 START TEST bdev_fio_rw_verify 00:41:51.657 ************************************ 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:41:51.657 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:41:51.658 17:35:46 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:41:51.658 17:35:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:41:51.658 17:35:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:41:51.658 17:35:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:41:51.658 17:35:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:41:51.658 17:35:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:41:51.917 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:41:51.917 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:41:51.917 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:41:51.917 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:41:51.917 fio-3.35 00:41:51.917 Starting 4 threads 00:42:06.803 00:42:06.803 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=145103: Tue Jul 23 17:36:00 2024 00:42:06.803 read: IOPS=21.6k, BW=84.4MiB/s (88.5MB/s)(844MiB/10001msec) 00:42:06.803 slat (usec): min=17, max=533, avg=64.12, stdev=43.36 00:42:06.803 clat (usec): min=18, max=1796, avg=355.34, stdev=245.13 00:42:06.803 lat (usec): min=49, max=1962, avg=419.45, stdev=269.97 00:42:06.803 clat percentiles (usec): 00:42:06.803 | 50.000th=[ 277], 99.000th=[ 1188], 99.900th=[ 1385], 99.990th=[ 1483], 00:42:06.803 | 99.999th=[ 1680] 00:42:06.803 write: IOPS=23.7k, BW=92.4MiB/s (96.9MB/s)(899MiB/9725msec); 0 zone resets 00:42:06.803 slat (usec): min=19, max=446, avg=76.94, stdev=42.93 00:42:06.803 clat (usec): min=26, 
max=3265, avg=398.18, stdev=253.56 00:42:06.803 lat (usec): min=70, max=3341, avg=475.12, stdev=277.58 00:42:06.803 clat percentiles (usec): 00:42:06.803 | 50.000th=[ 334], 99.000th=[ 1270], 99.900th=[ 1467], 99.990th=[ 1696], 00:42:06.803 | 99.999th=[ 2507] 00:42:06.803 bw ( KiB/s): min=78392, max=108277, per=97.66%, avg=92421.74, stdev=2004.98, samples=76 00:42:06.803 iops : min=19598, max=27069, avg=23105.37, stdev=501.22, samples=76 00:42:06.803 lat (usec) : 20=0.01%, 50=0.01%, 100=3.24%, 250=33.98%, 500=39.32% 00:42:06.803 lat (usec) : 750=14.53%, 1000=5.24% 00:42:06.803 lat (msec) : 2=3.68%, 4=0.01% 00:42:06.803 cpu : usr=99.59%, sys=0.00%, ctx=68, majf=0, minf=293 00:42:06.803 IO depths : 1=2.2%, 2=28.0%, 4=55.9%, 8=14.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:42:06.803 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:42:06.803 complete : 0=0.0%, 4=87.7%, 8=12.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:42:06.803 issued rwts: total=216068,230080,0,0 short=0,0,0,0 dropped=0,0,0,0 00:42:06.803 latency : target=0, window=0, percentile=100.00%, depth=8 00:42:06.803 00:42:06.803 Run status group 0 (all jobs): 00:42:06.803 READ: bw=84.4MiB/s (88.5MB/s), 84.4MiB/s-84.4MiB/s (88.5MB/s-88.5MB/s), io=844MiB (885MB), run=10001-10001msec 00:42:06.803 WRITE: bw=92.4MiB/s (96.9MB/s), 92.4MiB/s-92.4MiB/s (96.9MB/s-96.9MB/s), io=899MiB (942MB), run=9725-9725msec 00:42:06.803 00:42:06.803 real 0m13.766s 00:42:06.803 user 0m45.921s 00:42:06.803 sys 0m0.726s 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:42:06.803 ************************************ 00:42:06.803 END TEST bdev_fio_rw_verify 00:42:06.803 ************************************ 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:42:06.803 17:36:00 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:42:06.803 17:36:00 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d4d4fe1e-f9f9-54ef-8b60-fa61cd0bcbdf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d4d4fe1e-f9f9-54ef-8b60-fa61cd0bcbdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "7f608c1f-7a00-53cc-bbc9-9c1a59c22772"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7f608c1f-7a00-53cc-bbc9-9c1a59c22772",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2dbf549b-4152-5650-b11b-66f4beace959"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2dbf549b-4152-5650-b11b-66f4beace959",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f5c17bd1-0839-5070-8687-3211f94e1b80"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f5c17bd1-0839-5070-8687-3211f94e1b80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:42:06.803 crypto_ram1 00:42:06.803 crypto_ram2 00:42:06.803 crypto_ram3 ]] 00:42:06.803 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "d4d4fe1e-f9f9-54ef-8b60-fa61cd0bcbdf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d4d4fe1e-f9f9-54ef-8b60-fa61cd0bcbdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' 
' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "7f608c1f-7a00-53cc-bbc9-9c1a59c22772"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7f608c1f-7a00-53cc-bbc9-9c1a59c22772",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "2dbf549b-4152-5650-b11b-66f4beace959"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "2dbf549b-4152-5650-b11b-66f4beace959",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f5c17bd1-0839-5070-8687-3211f94e1b80"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f5c17bd1-0839-5070-8687-3211f94e1b80",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:42:06.804 
************************************ 00:42:06.804 START TEST bdev_fio_trim 00:42:06.804 ************************************ 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:42:06.804 17:36:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:42:06.804 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:42:06.804 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:42:06.804 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:42:06.804 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:42:06.804 fio-3.35 00:42:06.804 Starting 4 threads 00:42:19.016 00:42:19.016 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=147081: Tue Jul 23 17:36:14 2024 00:42:19.016 write: IOPS=34.7k, BW=136MiB/s (142MB/s)(1357MiB/10001msec); 0 zone resets 00:42:19.016 slat (usec): min=11, max=552, avg=68.01, stdev=35.39 00:42:19.016 clat (usec): min=19, max=1922, avg=238.46, stdev=131.92 00:42:19.016 lat (usec): min=41, max=2071, avg=306.47, stdev=148.86 00:42:19.016 clat percentiles (usec): 00:42:19.016 | 50.000th=[ 215], 99.000th=[ 619], 99.900th=[ 758], 99.990th=[ 848], 00:42:19.016 | 99.999th=[ 1156] 00:42:19.017 bw ( KiB/s): min=123360, max=173610, per=100.00%, avg=139006.84, stdev=4125.44, samples=76 00:42:19.017 iops : min=30840, max=43402, avg=34751.68, stdev=1031.35, samples=76 00:42:19.017 trim: IOPS=34.7k, BW=136MiB/s (142MB/s)(1357MiB/10001msec); 0 zone resets 00:42:19.017 slat (usec): min=4, max=1341, avg=19.52, stdev= 7.93 00:42:19.017 clat (usec): min=10, max=2072, avg=306.65, stdev=148.88 00:42:19.017 lat (usec): min=35, max=2096, avg=326.17, stdev=150.75 00:42:19.017 clat percentiles (usec): 00:42:19.017 | 50.000th=[ 277], 99.000th=[ 742], 99.900th=[ 881], 99.990th=[ 1004], 00:42:19.017 | 99.999th=[ 1401] 00:42:19.017 bw ( KiB/s): min=123360, max=173610, per=100.00%, avg=139006.84, stdev=4125.44, samples=76 00:42:19.017 iops : 
min=30840, max=43402, avg=34751.68, stdev=1031.35, samples=76 00:42:19.017 lat (usec) : 20=0.01%, 50=1.16%, 100=7.35%, 250=42.60%, 500=41.15% 00:42:19.017 lat (usec) : 750=7.24%, 1000=0.50% 00:42:19.017 lat (msec) : 2=0.01%, 4=0.01% 00:42:19.017 cpu : usr=99.61%, sys=0.00%, ctx=60, majf=0, minf=115 00:42:19.017 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:42:19.017 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:42:19.017 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:42:19.017 issued rwts: total=0,347485,347486,0 short=0,0,0,0 dropped=0,0,0,0 00:42:19.017 latency : target=0, window=0, percentile=100.00%, depth=8 00:42:19.017 00:42:19.017 Run status group 0 (all jobs): 00:42:19.017 WRITE: bw=136MiB/s (142MB/s), 136MiB/s-136MiB/s (142MB/s-142MB/s), io=1357MiB (1423MB), run=10001-10001msec 00:42:19.017 TRIM: bw=136MiB/s (142MB/s), 136MiB/s-136MiB/s (142MB/s-142MB/s), io=1357MiB (1423MB), run=10001-10001msec 00:42:19.276 00:42:19.276 real 0m13.676s 00:42:19.276 user 0m45.997s 00:42:19.276 sys 0m0.664s 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:42:19.276 ************************************ 00:42:19.276 END TEST bdev_fio_trim 00:42:19.276 ************************************ 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:42:19.276 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:42:19.276 17:36:14 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:42:19.276 00:42:19.276 real 0m27.798s 00:42:19.276 user 1m32.112s 00:42:19.276 sys 0m1.577s 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:19.276 17:36:14 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:42:19.276 ************************************ 00:42:19.276 END TEST bdev_fio 00:42:19.276 ************************************ 00:42:19.276 17:36:14 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:42:19.276 17:36:14 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:42:19.276 17:36:14 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:42:19.276 17:36:14 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:42:19.276 17:36:14 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:42:19.276 17:36:14 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:42:19.535 ************************************ 00:42:19.535 START TEST bdev_verify 00:42:19.535 ************************************ 00:42:19.535 17:36:14 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:42:19.535 [2024-07-23 17:36:14.768664] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:42:19.535 [2024-07-23 17:36:14.768729] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148732 ] 00:42:19.535 [2024-07-23 17:36:14.903721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:42:19.794 [2024-07-23 17:36:14.965833] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:42:19.794 [2024-07-23 17:36:14.965837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:19.794 [2024-07-23 17:36:14.987299] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:42:19.794 [2024-07-23 17:36:14.995329] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:42:19.794 [2024-07-23 17:36:15.003349] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:42:19.794 [2024-07-23 17:36:15.110450] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:42:22.325 [2024-07-23 17:36:17.503067] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:42:22.325 [2024-07-23 17:36:17.503143] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:42:22.325 [2024-07-23 17:36:17.503158] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:42:22.325 [2024-07-23 17:36:17.511086] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:42:22.325 [2024-07-23 17:36:17.511105] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:42:22.325 [2024-07-23 17:36:17.511117] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:42:22.325 
[2024-07-23 17:36:17.519109] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:42:22.325 [2024-07-23 17:36:17.519129] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:42:22.325 [2024-07-23 17:36:17.519140] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:42:22.325 [2024-07-23 17:36:17.527131] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:42:22.325 [2024-07-23 17:36:17.527151] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:42:22.325 [2024-07-23 17:36:17.527162] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:42:22.325 Running I/O for 5 seconds...
00:42:27.594
00:42:27.594 Latency(us)
00:42:27.594 Device Information                                                            : runtime(s)  IOPS     MiB/s  Fail/s  TO/s  Average    min       max
00:42:27.594 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:42:27.594   Verification LBA range: start 0x0 length 0x1000
00:42:27.594     crypto_ram  : 5.08  472.71   1.85   0.00    0.00  269579.32  3376.53   165036.74
00:42:27.594 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:42:27.594   Verification LBA range: start 0x1000 length 0x1000
00:42:27.594     crypto_ram  : 5.07  378.45   1.48   0.00    0.00  337064.95  17096.35  205156.17
00:42:27.595 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:42:27.595   Verification LBA range: start 0x0 length 0x1000
00:42:27.595     crypto_ram1 : 5.08  475.70   1.86   0.00    0.00  267457.91  3647.22   152271.47
00:42:27.595 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:42:27.595   Verification LBA range: start 0x1000 length 0x1000
00:42:27.595     crypto_ram1 : 5.07  378.34   1.48   0.00    0.00  335839.59  18350.08  186920.07
00:42:27.595 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:42:27.595   Verification LBA range: start 0x0 length 0x1000
00:42:27.595     crypto_ram2 : 5.06  3667.37  14.33  0.00    0.00  34602.33   5698.78   28151.99
00:42:27.595 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:42:27.595   Verification LBA range: start 0x1000 length 0x1000
00:42:27.595     crypto_ram2 : 5.06  2959.43  11.56  0.00    0.00  42784.30   9573.95   31913.18
00:42:27.595 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:42:27.595   Verification LBA range: start 0x0 length 0x1000
00:42:27.595     crypto_ram3 : 5.06  3664.52  14.31  0.00    0.00  34530.41   6981.01   28038.01
00:42:27.595 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:42:27.595   Verification LBA range: start 0x1000 length 0x1000
00:42:27.595     crypto_ram3 : 5.07  2967.39  11.59  0.00    0.00  42563.73   1032.90   32824.99
00:42:27.595 ===================================================================================================================
00:42:27.595 Total       : 14963.91  58.45  0.00    0.00  67937.85   1032.90   205156.17
00:42:27.854
00:42:27.854 real	0m8.419s
00:42:27.854 user	0m15.813s
00:42:27.854 sys	0m0.575s
00:42:27.854 17:36:23 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:42:27.854 17:36:23 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:42:27.854 ************************************
00:42:27.854 END TEST bdev_verify
00:42:27.854 ************************************
00:42:27.854 17:36:23 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:42:27.854 17:36:23 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:42:27.854 17:36:23 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:42:27.854 17:36:23 blockdev_crypto_qat --
common/autotest_common.sh@1105 -- # xtrace_disable
00:42:27.854 17:36:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:42:27.854 ************************************
00:42:27.854 START TEST bdev_verify_big_io
00:42:27.854 ************************************
00:42:27.854 17:36:23 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:42:27.854 [2024-07-23 17:36:23.274537] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization...
00:42:27.854 [2024-07-23 17:36:23.274604] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149798 ]
00:42:28.113 [2024-07-23 17:36:23.406873] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:42:28.113 [2024-07-23 17:36:23.461693] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:42:28.113 [2024-07-23 17:36:23.461699] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:42:28.113 [2024-07-23 17:36:23.483295] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:42:28.113 [2024-07-23 17:36:23.491325] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:42:28.113 [2024-07-23 17:36:23.499345] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:42:28.372 [2024-07-23 17:36:23.607651] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:42:30.991 [2024-07-23 17:36:26.001705] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:42:30.991 [2024-07-23 17:36:26.001783] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:42:30.991 [2024-07-23 17:36:26.001798] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:42:30.991 [2024-07-23 17:36:26.009720] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:42:30.991 [2024-07-23 17:36:26.009743] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:42:30.991 [2024-07-23 17:36:26.009755] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:42:30.991 [2024-07-23 17:36:26.017742] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:42:30.991 [2024-07-23 17:36:26.017764] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:42:30.991 [2024-07-23 17:36:26.017776] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:42:30.991 [2024-07-23 17:36:26.025766] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:42:30.991 [2024-07-23 17:36:26.025791] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:42:30.991 [2024-07-23 17:36:26.025803] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:42:30.991 Running I/O for 5 seconds...
00:42:31.560 [2024-07-23 17:36:26.968335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:31.560 [2024-07-23 17:36:26.968913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:31.560 [2024-07-23 17:36:26.969010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! repeated throughout the run ...]
00:42:31.824 [2024-07-23 17:36:27.066724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.067098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.067154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.067215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.067274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.067596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.067619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.070848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.070910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.070963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.071016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.071400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.071455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.824 [2024-07-23 17:36:27.071516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.071572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.071901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.071924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.074366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.074423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.074475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.074527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.075018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.075075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.075128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.075182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.824 [2024-07-23 17:36:27.075711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.824 [2024-07-23 17:36:27.075734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.078663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.078720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.078772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.078824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.079384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.079440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.079491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.079543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.079912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.079935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.825 [2024-07-23 17:36:27.083204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.083954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.084311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.084333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.086580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.086637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.086689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.086740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.087107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.825 [2024-07-23 17:36:27.087170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.087226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.087279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.087814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.087838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.090816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.090873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.090930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.090984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.091346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.091408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.091469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.091523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.825 [2024-07-23 17:36:27.091925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.091948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.094659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.094718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.094772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.094826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.095441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.095498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.095551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.095604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.096055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.096078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.098474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.825 [2024-07-23 17:36:27.098532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.098583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.098636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.099002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.099058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.099111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.099173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.099496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.099519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.102748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.102808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.102863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.102921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.825 [2024-07-23 17:36:27.103290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.103351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.103404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.103456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.103772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.103794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.106196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.106255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.106307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.106352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.106921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.106979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.107034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.825 [2024-07-23 17:36:27.107087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.107598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.107622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.111405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.113151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.115137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.116533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.825 [2024-07-23 17:36:27.117615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.118117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.119804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.121657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.121987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.122010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.826 [2024-07-23 17:36:27.124830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.125335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.125855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.127175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.129519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.131487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.132942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.134659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.134991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.135014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.139970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.141879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.143843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.826 [2024-07-23 17:36:27.145095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.147452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.149399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.150350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.150847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.151387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.151411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.155018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.156753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.158606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.160149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.161168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.161665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.826 [2024-07-23 17:36:27.163330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.165184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.165509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.165532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.168439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.168944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.169456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.170758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.173110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.175078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.176594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.178330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.178660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.826 [2024-07-23 17:36:27.178683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.183546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.185554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.187523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.188787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.191117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.193088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.193852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.194350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.194889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.194922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.198670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:31.826 [2024-07-23 17:36:27.200427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:31.826 [2024-07-23 17:36:27.202418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:32.354 [... same *ERROR* line repeated for every entry from [2024-07-23 17:36:27.203748] through [2024-07-23 17:36:27.515373]; duplicate messages elided ...]
00:42:32.354 [2024-07-23 17:36:27.515428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.515481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.515983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.516044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.516098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.516154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.516208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.516743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.516766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.518737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.518796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.518848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.518909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.354 [2024-07-23 17:36:27.519265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.519331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.519384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.519436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.519494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.519818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.519841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.522590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.522651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.522704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.522758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.523085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.523162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.354 [2024-07-23 17:36:27.523217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.523276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.523329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.523650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.523672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.525674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.525731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.525783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.525836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.526158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.526231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.526294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.526352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.354 [2024-07-23 17:36:27.526405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.526914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.526939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.529489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.529547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.529607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.529665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.529991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.530059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.530118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.530171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.530223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.354 [2024-07-23 17:36:27.530673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.354 [2024-07-23 17:36:27.530696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.532688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.532746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.532798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.532851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.533372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.533434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.533489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.533563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.533619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.534198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.534222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.355 [2024-07-23 17:36:27.536250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.536927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.537250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.537272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.539802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.539864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.539924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.355 [2024-07-23 17:36:27.539985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.540464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.540532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.540586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.540638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.540689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.541065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.541088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.355 [2024-07-23 17:36:27.543656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.543818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.544266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.544291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.355 [2024-07-23 17:36:27.547912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.547965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.548286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.548310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.550339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.550406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.550471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.550529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.551035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.551098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.551153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.551207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.551265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.355 [2024-07-23 17:36:27.551753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.551776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.554891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.555300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.555323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.355 [2024-07-23 17:36:27.557643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.557701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.557754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.557809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.558333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.558395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.558449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.355 [2024-07-23 17:36:27.558505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.558559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.558995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.559018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.356 [2024-07-23 17:36:27.561115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.561745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.562073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.562096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.564838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.564905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.564968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.356 [2024-07-23 17:36:27.565380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.565957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.567949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.568008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.568059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.568143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.568462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.356 [2024-07-23 17:36:27.568533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.356 [2024-07-23 17:36:27.568588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.356 (last message repeated continuously from 17:36:27.568588 through 17:36:27.700485; intermediate occurrences omitted) 
00:42:32.359 [2024-07-23 17:36:27.700485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.359 [2024-07-23 17:36:27.701057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.701562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.702069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.703937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.705084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.705513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.705536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.708950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.709446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.709952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.359 [2024-07-23 17:36:27.711182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.711546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.713523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.360 [2024-07-23 17:36:27.715478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.716849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.718584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.718916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.718939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.722434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.724181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.726169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.728028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.728410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.730150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.732128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.733846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.360 [2024-07-23 17:36:27.734359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.734843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.734867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.739124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.740622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.742363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.744372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.744715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.745244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.745740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.746241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.747985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.748312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.360 [2024-07-23 17:36:27.748334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.752245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.753325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.753824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.754330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.754849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.756617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.758595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.760563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.761813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.762232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.762255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.765095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.360 [2024-07-23 17:36:27.766058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.767790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.769754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.770086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.360 [2024-07-23 17:36:27.771344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.621 [2024-07-23 17:36:27.773077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.775047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.776798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.777321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.777345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.781857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.783833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.785133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.622 [2024-07-23 17:36:27.786873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.787205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.789189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.789698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.790212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.790711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.791050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.791073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.794711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.796701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.798018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.798526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.799012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.622 [2024-07-23 17:36:27.799521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.801377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.803352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.805324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.805751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.805774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.808208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.808709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.809552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.811274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.811600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.813590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.814839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.622 [2024-07-23 17:36:27.816587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.818551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.818879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.818908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.823588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.825542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.827507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.828773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.829140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.831131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.833086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.833594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.834096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.622 [2024-07-23 17:36:27.834651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.834675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.837858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.839607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.841508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.843182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.843660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.844185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.844687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.846350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.848214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.848540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.848561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.622 [2024-07-23 17:36:27.852193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.852695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.853197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.853768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.854101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.856051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.858014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.859262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.861012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.861337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.861358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.864281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.866037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.622 [2024-07-23 17:36:27.868010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.869976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.870433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.872189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.874166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.876082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.876584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.877115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.877141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.622 [2024-07-23 17:36:27.881480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.882751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.884496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.886440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.623 [2024-07-23 17:36:27.886767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.887449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.887955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.888463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.889923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.890282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.890304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.894203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.896067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.896569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.897069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.897578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.899145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.623 [2024-07-23 17:36:27.900959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.902941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.904208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.904535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.904557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.907192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.907695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.909553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.911531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.911855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.913243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.914981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.623 [2024-07-23 17:36:27.916948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.623 [2024-07-23 17:36:27.918922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:32.888 [... previous message repeated with varying timestamps from 17:36:27.918922 through 17:36:28.086025 ...]
00:42:32.888 [2024-07-23 17:36:28.086078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.086133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.086186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.086705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.086728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.089464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.089537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.089591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.089644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.090111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.090183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.090238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.090291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.888 [2024-07-23 17:36:28.090377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.090831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.090854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.093820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.093879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.093953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.094021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.094548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.094618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.094672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.094726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.094780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.095301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.888 [2024-07-23 17:36:28.095325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.098379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.098437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.098491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.098545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.099020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.099095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.099150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.099203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.888 [2024-07-23 17:36:28.099257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.099765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.099788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.102556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.889 [2024-07-23 17:36:28.102617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.102670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.102723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.103936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.106856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.106935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.106989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.889 [2024-07-23 17:36:28.107042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.107522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.107592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.107649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.107703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.107755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.108302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.108325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.889 [2024-07-23 17:36:28.111844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.111972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.112027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.112576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.112599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.115447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.115504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.115556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.115609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.116119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.116185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.116239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.889 [2024-07-23 17:36:28.116305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.116361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.116834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.116863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.119830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.119905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.119973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.120050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.120514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.120584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.120639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.120694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.120747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.889 [2024-07-23 17:36:28.121317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.121340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.124949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.125003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.125056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.125590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.125613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.889 [2024-07-23 17:36:28.128475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.128533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.128591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.128643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.129879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.889 [2024-07-23 17:36:28.132703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.132760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.890 [2024-07-23 17:36:28.132825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.132905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.133424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.133505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.133569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.133623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.133675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.134213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.134237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.890 [2024-07-23 17:36:28.137736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.137967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.138448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.138471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.140845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.140907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.141424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.141487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.141979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.142059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.890 [2024-07-23 17:36:28.142113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.142168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.142222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.142748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.142771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.145592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.145651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.145704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.147140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.147518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.147586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.147639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.147692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.890 [2024-07-23 17:36:28.147752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.148078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.148101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.151506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.152024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.152526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.153194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.153542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.155552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.157510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.158771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.160507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:32.890 [2024-07-23 17:36:28.160832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:32.890 [2024-07-23 17:36:28.160854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:33.155 [2024-07-23 17:36:28.505432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.505919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.505941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.509283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.509790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.510298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.510807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.511296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.511803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.512307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.512810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.513311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.513813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.155 [2024-07-23 17:36:28.513838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.517126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.517639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.518145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.518648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.519128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.519633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.520139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.520642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.521148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.521692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.521715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.525038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.155 [2024-07-23 17:36:28.525544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.526049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.526549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.527047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.527553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.528057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.528575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.529081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.529668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.529697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.532983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.533489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.534066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.155 [2024-07-23 17:36:28.535530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.536060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.537458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.538117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.538616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.539122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.539447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.539470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.542824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.544049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.544110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.544711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.545210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.155 [2024-07-23 17:36:28.546895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.547394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.547896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.549101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.549518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.549540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.552746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.554556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.555066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.555126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.555512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.556587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.557092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.155 [2024-07-23 17:36:28.557596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.559410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.559991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.560015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.562638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.562712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.562785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.562840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.155 [2024-07-23 17:36:28.563222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.563286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.563339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.563391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.563449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.156 [2024-07-23 17:36:28.564043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.564066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.566609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.566679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.566745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.566813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.567279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.567343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.567397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.567449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.567502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.568056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.568080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.156 [2024-07-23 17:36:28.570705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.570775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.570841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.570937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.571387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.571453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.571512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.571565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.571617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.572172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.156 [2024-07-23 17:36:28.572196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.574752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.574822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.418 [2024-07-23 17:36:28.574885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.574959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.575453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.575524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.575578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.575630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.575682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.576164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.576187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.418 [2024-07-23 17:36:28.578697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.578756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.578808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.578873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.419 [2024-07-23 17:36:28.579392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.579466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.579520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.579572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.579625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.580036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.580059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.582897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.582968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.583021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.583074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.583514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.583586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.419 [2024-07-23 17:36:28.583658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.583711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.583764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.584103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.584126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.587812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.419 [2024-07-23 17:36:28.587865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.588201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.588225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.590980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.591890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.592219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.419 [2024-07-23 17:36:28.592241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.595820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.596151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.596174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.419 [2024-07-23 17:36:28.598607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.419 [2024-07-23 17:36:28.598677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same *ERROR* message repeated for subsequent allocation attempts from 17:36:28.598742 through 17:36:28.711178 ...]
00:42:33.422 [2024-07-23 17:36:28.711217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.422 [2024-07-23 17:36:28.715644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.422 [2024-07-23 17:36:28.716916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.422 [2024-07-23 17:36:28.718665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.422 [2024-07-23 17:36:28.720626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.422 [2024-07-23 17:36:28.720956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.422 [2024-07-23 17:36:28.721642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.722144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.722643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.724205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.724556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.724578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.728366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.423 [2024-07-23 17:36:28.729954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.730455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.730955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.731465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.733317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.735229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.737197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.738462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.738837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.738865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.741798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.742740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.744368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.423 [2024-07-23 17:36:28.746337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.746661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.747949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.749691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.751667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.753536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.754127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.754150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.758629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.760477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.762246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.764098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.764422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.423 [2024-07-23 17:36:28.766199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.766695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.767193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.767688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.768018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.768042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.772051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.774027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.774704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.775204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.775773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.776625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.778365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.423 [2024-07-23 17:36:28.780336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.782189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.782592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.782614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.785305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.785809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.787655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.789554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.789877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.791587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.793432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.795407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.797391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.423 [2024-07-23 17:36:28.797832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.797855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.423 [2024-07-23 17:36:28.802458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.804431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.805708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.807441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.807766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.809759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.810646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.811145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.811641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.812113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.812136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.424 [2024-07-23 17:36:28.815899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.817874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.819708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.820210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.820695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.821207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.822902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.824752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.826724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.827139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.827162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.829681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.830187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.424 [2024-07-23 17:36:28.831077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.832817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.833146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.835131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.424 [2024-07-23 17:36:28.836394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.838136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.840109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.840435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.840458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.845186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.847155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.849012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.850791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.686 [2024-07-23 17:36:28.851149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.853125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.854865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.855371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.855870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.856387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.856412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.860234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.862209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.864196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.865008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.865541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.866051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.686 [2024-07-23 17:36:28.866795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.868099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.869709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.870042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.870065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.873630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.874143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.875812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.876317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.876642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.878412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.880391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.881716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.686 [2024-07-23 17:36:28.883453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.883780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.883803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.886815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.888563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.890571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.892547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.893078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.894929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.896913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.898830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.899544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.900103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.686 [2024-07-23 17:36:28.900128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.903496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.904010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.904512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.905020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.905539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.906052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.906554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.907066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.907569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.908059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.908083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.911339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.686 [2024-07-23 17:36:28.911843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.912359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.912858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.913376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.913881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.914387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.914899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.915398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.915950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.915973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.919263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.919779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.686 [2024-07-23 17:36:28.920306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.687 [2024-07-23 17:36:28.920803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:33.952 [... same "Failed to get src_mbufs!" error repeated through 2024-07-23 17:36:29.116625 ...]
00:42:33.952 [2024-07-23 17:36:29.116692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.116748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.116802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.117344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.117405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.117460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.117514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.117569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.118093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.118117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.120326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.120384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.120441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.952 [2024-07-23 17:36:29.120495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.120934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.121003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.121056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.121110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.121162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.121529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.121551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.952 [2024-07-23 17:36:29.124782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.124970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.125293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.125315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.127392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.127463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.127515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.127567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.127887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.127962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.128016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.952 [2024-07-23 17:36:29.128079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.128137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.128463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.128486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.131383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.131441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.131500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.131556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.131876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.131960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.132016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.132069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.132122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.952 [2024-07-23 17:36:29.132444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.132466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.134567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.134626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.134689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.134743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.135261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.135322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.135376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.135431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.135485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.952 [2024-07-23 17:36:29.135978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.136002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.140547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.140614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.140666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.140719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.141639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.146222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.146288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.146340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.146392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.146858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.146934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.146991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.147044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.147096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.147467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.147489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.152655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.152717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.152769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.152821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.153201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.153270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.153324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.153384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.153440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.153764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.153787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.159607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.159670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.159735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.159790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.160371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.160437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.160491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.160548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.160602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.160992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.161016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.166345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.166407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.166460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.166513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.166978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.167068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.167125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.167179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.167232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.167756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.167779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.172456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.172519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.172576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.172637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.172967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.173038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.173097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.173150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.173202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.173525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.173552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.177812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.177876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.177934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.177995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.178901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.183735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.953 [2024-07-23 17:36:29.183803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.183860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.183920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.184273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.184341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.953 [2024-07-23 17:36:29.184394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.184447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.184499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.184818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.184840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.954 [2024-07-23 17:36:29.191321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.191995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.192055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.192565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.192588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.197534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.197595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.197658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.197711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:33.954 [2024-07-23 17:36:29.198059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:33.954 [2024-07-23 17:36:29.198127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:34.218 [2024-07-23 17:36:29.538164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.538179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.541223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.542219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.543449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.545182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.545442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.546989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.548182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.548575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.548972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.549351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.549370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.218 [2024-07-23 17:36:29.549384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.554102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.554996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.556729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.557969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.558226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.559988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.561622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.562191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.562584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.563006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.563030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.563047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.218 [2024-07-23 17:36:29.566244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.567763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.569312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.218 [2024-07-23 17:36:29.570844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.571235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.572867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.574475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.576019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.577550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.577808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.577827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.577843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.582011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.219 [2024-07-23 17:36:29.583645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.585407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.587038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.587420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.588975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.590170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.591683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.593236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.593496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.593514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.593529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.595930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.597689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.219 [2024-07-23 17:36:29.598093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.598483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.598752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.600001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.601715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.603398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.603980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.604239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.604258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.604273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.609331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.609732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.610129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.219 [2024-07-23 17:36:29.611064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.611321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.612722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.614263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.615324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.616713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.617021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.617040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.617055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.618814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.619214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.619265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.619647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.219 [2024-07-23 17:36:29.620048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.620475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.620495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.620898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.621295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.621346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.621726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.622127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.622523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.622542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.622557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.622572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.625938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.219 [2024-07-23 17:36:29.625994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.626380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.626427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.626848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.626867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.627279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.627332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.627736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.627786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.628236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.628255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.628270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.628286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.219 [2024-07-23 17:36:29.631026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.631081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.631474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.631524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.631995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.632013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.632407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.632454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.632842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.219 [2024-07-23 17:36:29.632887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.220 [2024-07-23 17:36:29.633309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.220 [2024-07-23 17:36:29.633329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.220 [2024-07-23 17:36:29.633350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.220 [2024-07-23 17:36:29.633366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.220 [2024-07-23 17:36:29.636829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.220 [2024-07-23 17:36:29.636887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.637285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.637350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.637846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.637865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.638269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.638321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.638713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.638759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.639179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.639199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.482 [2024-07-23 17:36:29.639217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.639235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.641959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.642014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.642401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.642447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.642870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.642890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.643289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.643339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.643726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.643772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.644116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.482 [2024-07-23 17:36:29.644136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.644152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.644168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.647889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.647957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.648348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.648400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.648765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.648784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.649184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.649231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.649619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.649663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.482 [2024-07-23 17:36:29.650118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.650139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.650155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.650172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.652671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.652724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.653118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.653164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.653628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.653647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.654047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.654115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.482 [2024-07-23 17:36:29.654505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.482 [2024-07-23 17:36:29.654568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[identical "Failed to get src_mbufs!" entries from 2024-07-23 17:36:29.654918 through 2024-07-23 17:36:29.745849 omitted]
00:42:34.485 [2024-07-23 17:36:29.745889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.745936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.746685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.749670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.486 [2024-07-23 17:36:29.749721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.749762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.749803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.750607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.486 [2024-07-23 17:36:29.752215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.752280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.752600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.752644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.752936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.752956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.853137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.853514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.862559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.862627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.864119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.864166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.864219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.486 [2024-07-23 17:36:29.865423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.865837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.865861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.872251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.873214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.873262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.873319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.875050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.875356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.875374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.875389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.879640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.880038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.486 [2024-07-23 17:36:29.880087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.880474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.880842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.880906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.882173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.882220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.882273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.883849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.884114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.884133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.884148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.884164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.884178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.486 [2024-07-23 17:36:29.888559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.890112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.890163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.890770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.891295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.891683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.891729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.892130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.892540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.892560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.892576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.892592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.892607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.486 [2024-07-23 17:36:29.896696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.897667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.897716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.898942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.899232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.900758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.486 [2024-07-23 17:36:29.900806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.902333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.902654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.902674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.902689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.902705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.902720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.748 [2024-07-23 17:36:29.905594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.906812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.908342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.909882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.910180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.910757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.912419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.914009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.914267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.914286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.914301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.914320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.914335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.748 [2024-07-23 17:36:29.918706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.919979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.921211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.922748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.924532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.925594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.927129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.928359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.928618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.928637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.928652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.928667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.928682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.748 [2024-07-23 17:36:29.933173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.933856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.935384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.937119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.938914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.940566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.941509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.942769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.943032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.943050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.943065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.943080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.943095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.748 [2024-07-23 17:36:29.947317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.947715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.748 [2024-07-23 17:36:29.949240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.950466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.952376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.953912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.954703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.956464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.956774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.956793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.956808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.956824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.956838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.749 [2024-07-23 17:36:29.961272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.961678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.963223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.964428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.966297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.967836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.968637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.970392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.970705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.970724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.970739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.970754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.970768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.749 [2024-07-23 17:36:29.975407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.975805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.976658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.978017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.979793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.981350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.982821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.983986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.984256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.984276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.984292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.984309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.984324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:34.749 [2024-07-23 17:36:29.989590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.989999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.990390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.992152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.994001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.995769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.997536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.998110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.998368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.998387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.998402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.998417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:34.749 [2024-07-23 17:36:29.998432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.015 [2024-07-23 17:36:30.183093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.183495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.183900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.183942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.184345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.184736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.184781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.185176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.185683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.185701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.185716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.185731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.185746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.015 [2024-07-23 17:36:30.188837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.189238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.189287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.189674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.190399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.190451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.190837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.191240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.191627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.191644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.191659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.191674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.191687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.015 [2024-07-23 17:36:30.194757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.195164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.195212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.195602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.196050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.196444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.196840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.196890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.197299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.197316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.197331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.197346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.197361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.015 [2024-07-23 17:36:30.200322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.200721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.200767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.201161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.201925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.201976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.015 [2024-07-23 17:36:30.202948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.206285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.206701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.206759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.207175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.208051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.208106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.208495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.208539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.208973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.208992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.209007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.209022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.209037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.212050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.212101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.212143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.212184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.212979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.213505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.216628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.216678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.216733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.216775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.217942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.221055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.221789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.222188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.222207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.222223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.222238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.222252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.225292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.225944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.226386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.226404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.226420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.226435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.226450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.229246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.229934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.230347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.230368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.230383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.230398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.230413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.016 [2024-07-23 17:36:30.233528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.233579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.233622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.016 [2024-07-23 17:36:30.233664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.234568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.017 [2024-07-23 17:36:30.238984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.239844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.017 [2024-07-23 17:36:30.243492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.243542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.243583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.243624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.017 [2024-07-23 17:36:30.244625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.017 [2024-07-23 17:36:30.248721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.021 [... preceding "Failed to get src_mbufs!" error repeated continuously through 2024-07-23 17:36:30.408418 ...]
00:42:35.021 [2024-07-23 17:36:30.408432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.408447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.408460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.413428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.414978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.415028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.416595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.416965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.418723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.418776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.418823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.419220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.419639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.021 [2024-07-23 17:36:30.419660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.419676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.419691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.419705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.424254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.425790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.425840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.427368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.427625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.427681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.428846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.428901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.430119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.021 [2024-07-23 17:36:30.430375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.430392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.430406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.430420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.021 [2024-07-23 17:36:30.430433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.434587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.434989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.435036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.435758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.436021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.436073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.437389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.437436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.283 [2024-07-23 17:36:30.438970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.439226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.439243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.439257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.439275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.439289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.445693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.446431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.447925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.448316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.448737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.448788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.450129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.283 [2024-07-23 17:36:30.451025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.451415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.451823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.451840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.451854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.451868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.451882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.458268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.459762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.461302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.462847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.463115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.463517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.283 [2024-07-23 17:36:30.463915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.464305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.464694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.465131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.465150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.465165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.465180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.465198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.283 [2024-07-23 17:36:30.471323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.472560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.474097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.475808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.476078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.477015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.478558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.479247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.479641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.480013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.480030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.480045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.480059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.480073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.486756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.488295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.489076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.490818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.491146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.492694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.494418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.496186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.496580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.497065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.497086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.497102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.497117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.497132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.502171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.503709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.504924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.506286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.506604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.508373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.510015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.511523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.512381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.512639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.512655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.512669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.512683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.512697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.518810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.520043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.521701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.523462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.523720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.524912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.526347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.527576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.529117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.529375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.529392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.529406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.529420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.529434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.534638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.536064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.537826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.539390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.539647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.541136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.542291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.543522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.545146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.545403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.545419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.545433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.545447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.545461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.551814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.552350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.552742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.553962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.554247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.555927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.557464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.559003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.560072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.560354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.560371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.560385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.284 [2024-07-23 17:36:30.560399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.284 [2024-07-23 17:36:30.560413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.563326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.563721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.564116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.564505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.564993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.565386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.566948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.568163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.569681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.569954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.569971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.285 [2024-07-23 17:36:30.569985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.285 [2024-07-23 17:36:30.569999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.552 [2024-07-23 17:36:30.733170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.552 [2024-07-23 17:36:30.733567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.733612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.734823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.737825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.737877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.738265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.738310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.552 [2024-07-23 17:36:30.738712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.738772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.739188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.739597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.739645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.740133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.740151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.740166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.740181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.740195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.742941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.743001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.743503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.552 [2024-07-23 17:36:30.743551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.743803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.744355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.744408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.744795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.744838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.745116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.745133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.745147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.745161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.745174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.747667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.747721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.552 [2024-07-23 17:36:30.748114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.748159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.748582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.748981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.749945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.751798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.552 [2024-07-23 17:36:30.751842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.751883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.751931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.752356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.752751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.752797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.752837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.752878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.753141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.753158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.753172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.753186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.753201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.552 [2024-07-23 17:36:30.755558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.755604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.755646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.755687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.552 [2024-07-23 17:36:30.756739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.756754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.758983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.759982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.760000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.760015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.760028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.762834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.763244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.763261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.763276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.763290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.763305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.765595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.765641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.765684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.765726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.766733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.766779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.768803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.768846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.768886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.768937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.769446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.769491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.769533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.769575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.769619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.770048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.770064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.770079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.770094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.770117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.772941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.772997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.773470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.773486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.773501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.773516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.773530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.553 [2024-07-23 17:36:30.776700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.776742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.777066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.777084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.777098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.777113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.553 [2024-07-23 17:36:30.777127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.554 [2024-07-23 17:36:30.779242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.554 [2024-07-23 17:36:30.779288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.554 [2024-07-23 17:36:30.779345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.554 [2024-07-23 17:36:30.779390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.554 [2024-07-23 17:36:30.779643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.554 [2024-07-23 17:36:30.779689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.554 [2024-07-23 17:36:30.779730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [identical "Failed to get src_mbufs!" error repeated through 17:36:30.881212; repeats omitted]
00:42:35.557 [2024-07-23 17:36:30.881226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.882852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.883252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.883299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.883687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.883971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.884023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.557 [2024-07-23 17:36:30.885813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.885828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.887518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.889105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.889151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.890845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.891275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.893032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.893084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.893130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.894891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.895150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.895166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.557 [2024-07-23 17:36:30.895180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.895195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.895208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.896803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.897208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.897253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.897640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.897910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.897962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.899030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.899075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.899462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.899857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.557 [2024-07-23 17:36:30.899873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.899888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.557 [2024-07-23 17:36:30.899907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.899921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.901399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.903166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.903211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.904028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.904283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.904332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.905732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.905778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.907317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.558 [2024-07-23 17:36:30.907574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.907590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.907604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.907618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.907632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.909778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.910829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.912015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.912406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.912836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.912887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.914649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.915861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.558 [2024-07-23 17:36:30.917395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.917649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.917666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.917680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.917694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.917708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.920855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.922553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.924311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.924748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.925008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.925763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.926158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.558 [2024-07-23 17:36:30.926969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.928380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.928850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.928867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.928883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.928904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.928919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.931038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.932795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.934192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.935000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.935290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.936705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.558 [2024-07-23 17:36:30.937100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.937489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.939239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.939633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.939649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.939664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.939679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.939694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.942046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.943804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.944204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.945965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.946398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.558 [2024-07-23 17:36:30.946796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.947564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.949903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.952825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.953701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.955408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.956896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.558 [2024-07-23 17:36:30.957151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.958838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.960603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.961099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.962825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.963305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.963325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.558 [2024-07-23 17:36:30.963340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.559 [2024-07-23 17:36:30.963355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.559 [2024-07-23 17:36:30.963369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.559 [2024-07-23 17:36:30.965763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.559 [2024-07-23 17:36:30.967276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.559 [2024-07-23 17:36:30.968504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.822 [2024-07-23 17:36:30.970041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.970297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.971833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.972628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.974392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.975608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.975863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.975879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.975905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.975920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.975934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.978459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.978854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.822 [2024-07-23 17:36:30.979482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.981079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.981586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.981984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.983413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.984804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.986330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.986588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.986605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.986619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.986633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.986647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.989787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.822 [2024-07-23 17:36:30.991333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.992969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.993735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.993994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.994444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.994833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.995938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.997066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.997530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.997549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.997564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.997579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.822 [2024-07-23 17:36:30.997598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.822 [2024-07-23 17:36:31.000999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.002598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.003218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.004973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.005301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.006848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.008487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.010244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.010728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.010989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.011006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.011020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.011034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.011048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.013481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.013870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.015309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.016528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.016782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.018383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.019918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.020786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.022501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.022811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.022827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.022841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.022856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.022870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.025461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.026521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.026921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.027454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.027708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.028388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.028781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.029834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.031059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.031316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.031333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.031347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.031361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.031375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.034496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.822 [2024-07-23 17:36:31.036214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.037744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.039282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.039540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.040344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.041759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.042861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.046055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.046106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.047632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.047679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.047943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.049706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.050514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.051917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.051968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.052221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.052237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.052251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.052266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.052280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.055587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.055643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.056037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.056082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.056499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.057772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.057819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.058594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.058638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.059064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.059082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.059097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.059112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.059127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.062197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.062249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.063900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.063946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.064343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.065932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.065978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.067732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.067777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.068042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.068059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.068074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.068088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.068102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.069992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.070043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.070431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.072028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.072368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.072770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.072817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.073966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.075394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.076065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.077584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.077647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.077912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.079444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.079493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.081379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.084909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.823 [2024-07-23 17:36:31.085714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.085760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.086155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.086493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.088120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.088168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.089913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.089957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.090216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.090232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.090247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.090262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.090275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.093341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.093391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.094930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.096461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.096722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.097564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.098939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.098984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.099372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.099790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.099808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.099823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.099842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.099857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.101865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.103320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.105089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.105134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.105388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.106937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.106985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.108166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.109546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.109858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.109874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.109889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.109908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.109923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.112119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.113512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.113559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.113952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.114366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.114415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.116055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.116657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.116702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.117135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.117153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.117168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.117183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.117198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.120215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.120269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.121759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.122851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.123118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.124693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.126222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.126269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.127783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.128067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.128084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.128099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.128115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.128129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.130177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.131704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.132401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.132447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.132868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.133609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.133657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.134960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.136717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.136983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.137000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.137014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.824 [2024-07-23 17:36:31.137028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.137042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.140095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.141625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.141673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.143203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.143467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.143525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.144549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.145749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.145793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.146236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.146254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.146269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.146284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.146301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.149080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.149130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.150384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.152142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.152398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.153951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.155304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.155350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.156760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.157074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.157091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.157105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.157119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.157133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.158546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.158947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.159340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.159385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.159763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.161528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.161577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.161976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.162365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.162644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.162660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.162674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.162689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.162702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.165650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.166770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.167999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.168046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.168301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.168349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.169975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.171504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.172353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.172759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.172777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.172791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.172806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.172820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.175216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.176495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.176543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.176584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.176885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.176946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.178473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:42:35.825 [2024-07-23 17:36:31.178520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.180148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.180402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.180418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.180432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.180447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.180461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.183854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.183910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.185667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.185711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.186134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.186553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.825 [2024-07-23 17:36:31.186600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.186993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.188463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.188773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.188790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.188805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.188821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.825 [2024-07-23 17:36:31.188835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.192285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.192335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.193867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.193917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.194168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.826 [2024-07-23 17:36:31.194215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.194907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.196787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.198725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.198775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.199170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.199217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.826 [2024-07-23 17:36:31.199468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.200272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.200318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.200706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.200749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.201070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.201087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.201101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.201115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.201129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.204295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.204351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.205365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.826 [2024-07-23 17:36:31.205411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.205676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.207327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.207383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.208951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.208997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.209252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.209268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.209283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.209297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.209315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.211464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.211509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.826 [2024-07-23 17:36:31.211550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.211590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.211841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.212868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.826 [2024-07-23 17:36:31.214381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.214922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.215289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.215306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.215320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.215334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.215348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.826 [2024-07-23 17:36:31.216822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.216866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.216917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.216958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.826 [2024-07-23 17:36:31.217208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.217793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.827 [2024-07-23 17:36:31.217807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.219908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.219955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.219998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.827 [2024-07-23 17:36:31.220966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.220980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.222460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.222505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.222546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.222586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.222950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.827 [2024-07-23 17:36:31.223424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.223452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.224861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.224911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.224952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.224992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.225433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.225495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.225541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.225582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.225623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.226099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.827 [2024-07-23 17:36:31.226117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.226135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.226151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.226166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.228787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.827 [2024-07-23 17:36:31.229099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.229116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.229130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.229144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.229158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.230574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.230620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.230660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.230702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.230964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.231017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.231058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:35.827 [2024-07-23 17:36:31.231098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:35.827 [2024-07-23 17:36:31.231138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [last message repeated ~270 times between 17:36:31.231454 and 17:36:31.322106]
00:42:36.094 [2024-07-23 17:36:31.322153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.322547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.322946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.323434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.323986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.324035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.325533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.325578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.326035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.326053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.326069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.326087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.326102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.094 [2024-07-23 17:36:31.328263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.328656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.328701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.329997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.330259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.330321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.330708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.330751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.330793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.331203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.331220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.331234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.331249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.094 [2024-07-23 17:36:31.331264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.333570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.335107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.335155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.335679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.336103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.336501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.336559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.336630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.337034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.337499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.337516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.337530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.094 [2024-07-23 17:36:31.337545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.337558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.339786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.340184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.340230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.340621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.341023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.341092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.341487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.341539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.341934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.342251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.342267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.094 [2024-07-23 17:36:31.342281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.342296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.342310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.344494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.344891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.344944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.345333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.345776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.345829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.347032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.347080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.347931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.348377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.094 [2024-07-23 17:36:31.348395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.348410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.348426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.348441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.350950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.352416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.353185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.353574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.094 [2024-07-23 17:36:31.353956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.354015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.354413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.354808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.355222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.095 [2024-07-23 17:36:31.355647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.355668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.355683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.355697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.355711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.358207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.358608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.359019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.359540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.359796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.360490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.360883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.361280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.095 [2024-07-23 17:36:31.361672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.362137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.362154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.362169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.362183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.362197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.364754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.365153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.365549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.365951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.366433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.367087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.368657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.095 [2024-07-23 17:36:31.369055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.369447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.369792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.369809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.369825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.369845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.369861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.373060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.373453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.373845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.374272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.374609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.375017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.095 [2024-07-23 17:36:31.375416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.377180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.377618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.378072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.378090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.378106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.378121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.378136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.380756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.382513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.382915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.383300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.383628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.095 [2024-07-23 17:36:31.384036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.384458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.384850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.385250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.385689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.385707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.385724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.385739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.385755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.388088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.388489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.388883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.389282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.095 [2024-07-23 17:36:31.389675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.390094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.390495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.390898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.391292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.391721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.391738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.391755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.391771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.391786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.395140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.396784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.095 [2024-07-23 17:36:31.397340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.096 [2024-07-23 17:36:31.398935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.399191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.400737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.402456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.402847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.403241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.403667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.403684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.403698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.403713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.403727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.407129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.408666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.096 [2024-07-23 17:36:31.409841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.411282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.411599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.413363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.415057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.416593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.417345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.417771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.417787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.417801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.417816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.417830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:42:36.096 [2024-07-23 17:36:31.421080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:42:36.930 00:42:36.930 Latency(us) 00:42:36.930 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:36.930 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x0 length 0x100 00:42:36.930 crypto_ram : 6.05 42.33 2.65 0.00 0.00 2940643.06 310013.77 2363399.12 00:42:36.930 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x100 length 0x100 00:42:36.930 crypto_ram : 6.03 32.99 2.06 0.00 0.00 3539978.37 106225.31 3005310.00 00:42:36.930 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x0 length 0x100 00:42:36.930 crypto_ram1 : 6.05 42.32 2.64 0.00 0.00 2843984.58 310013.77 2188332.52 00:42:36.930 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x100 length 0x100 00:42:36.930 crypto_ram1 : 6.10 38.82 2.43 0.00 0.00 3022066.82 91180.52 2757298.98 00:42:36.930 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x0 length 0x100 00:42:36.930 crypto_ram2 : 5.61 268.32 16.77 0.00 0.00 427961.84 80238.86 558024.79 00:42:36.930 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x100 length 0x100 00:42:36.930 crypto_ram2 : 5.70 217.57 13.60 0.00 0.00 515744.33 56303.97 638263.65 00:42:36.930 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x0 length 0x100 00:42:36.930 crypto_ram3 : 5.70 277.71 17.36 0.00 0.00 400812.17 24732.72 452255.39 00:42:36.930 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:42:36.930 Verification LBA range: start 0x100 length 0x100 00:42:36.930 crypto_ram3 : 5.85 232.47 14.53 0.00 0.00 469871.67 
23592.96 605438.66 00:42:36.930 =================================================================================================================== 00:42:36.930 Total : 1152.52 72.03 0.00 0.00 822601.24 23592.96 3005310.00 00:42:37.500 00:42:37.500 real 0m9.425s 00:42:37.500 user 0m17.747s 00:42:37.500 sys 0m0.660s 00:42:37.500 17:36:32 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:37.500 17:36:32 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:42:37.500 ************************************ 00:42:37.500 END TEST bdev_verify_big_io 00:42:37.500 ************************************ 00:42:37.500 17:36:32 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:42:37.500 17:36:32 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:42:37.500 17:36:32 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:42:37.500 17:36:32 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:42:37.500 17:36:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:42:37.500 ************************************ 00:42:37.500 START TEST bdev_write_zeroes 00:42:37.500 ************************************ 00:42:37.500 17:36:32 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:42:37.500 [2024-07-23 17:36:32.781370] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:42:37.500 [2024-07-23 17:36:32.781434] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151017 ] 00:42:37.500 [2024-07-23 17:36:32.912836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:37.759 [2024-07-23 17:36:32.962402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:37.759 [2024-07-23 17:36:32.983793] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:42:37.759 [2024-07-23 17:36:32.991821] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:42:37.759 [2024-07-23 17:36:32.999840] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:42:37.759 [2024-07-23 17:36:33.107574] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:42:40.292 [2024-07-23 17:36:35.477210] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:42:40.292 [2024-07-23 17:36:35.477274] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:42:40.292 [2024-07-23 17:36:35.477289] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:42:40.292 [2024-07-23 17:36:35.485229] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:42:40.292 [2024-07-23 17:36:35.485253] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:42:40.292 [2024-07-23 17:36:35.485266] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:42:40.292 [2024-07-23 17:36:35.493251] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:42:40.292 [2024-07-23 17:36:35.493268] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:42:40.292 [2024-07-23 17:36:35.493280] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:42:40.292 [2024-07-23 17:36:35.501275] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:42:40.292 [2024-07-23 17:36:35.501293] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:42:40.292 [2024-07-23 17:36:35.501304] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:42:40.292 Running I/O for 1 seconds... 00:42:41.230 00:42:41.230 Latency(us) 00:42:41.230 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:42:41.230 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:42:41.230 crypto_ram : 1.02 2038.31 7.96 0.00 0.00 62317.31 5442.34 75223.93 00:42:41.230 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:42:41.230 crypto_ram1 : 1.03 2051.48 8.01 0.00 0.00 61640.53 5442.34 69753.10 00:42:41.230 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:42:41.230 crypto_ram2 : 1.02 15752.25 61.53 0.00 0.00 8011.84 2436.23 10542.75 00:42:41.230 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:42:41.230 crypto_ram3 : 1.02 15731.04 61.45 0.00 0.00 7987.14 2421.98 8320.22 00:42:41.230 =================================================================================================================== 00:42:41.230 Total : 35573.07 138.96 0.00 0.00 14229.56 2421.98 75223.93 00:42:41.798 00:42:41.798 real 0m4.269s 00:42:41.798 user 0m3.708s 00:42:41.798 sys 0m0.519s 00:42:41.798 17:36:36 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:41.798 17:36:36 
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:42:41.798 ************************************ 00:42:41.798 END TEST bdev_write_zeroes 00:42:41.798 ************************************ 00:42:41.798 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:42:41.798 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:42:41.798 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:42:41.798 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:42:41.798 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:42:41.798 ************************************ 00:42:41.798 START TEST bdev_json_nonenclosed 00:42:41.798 ************************************ 00:42:41.798 17:36:37 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:42:41.798 [2024-07-23 17:36:37.135577] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:42:41.798 [2024-07-23 17:36:37.135638] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151559 ] 00:42:42.057 [2024-07-23 17:36:37.265716] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:42.057 [2024-07-23 17:36:37.314924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:42.057 [2024-07-23 17:36:37.314997] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:42:42.057 [2024-07-23 17:36:37.315014] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:42:42.057 [2024-07-23 17:36:37.315026] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:42.057 00:42:42.057 real 0m0.325s 00:42:42.057 user 0m0.176s 00:42:42.057 sys 0m0.147s 00:42:42.057 17:36:37 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:42:42.057 17:36:37 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:42.057 17:36:37 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:42:42.057 ************************************ 00:42:42.057 END TEST bdev_json_nonenclosed 00:42:42.057 ************************************ 00:42:42.057 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:42:42.057 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # true 00:42:42.057 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:42:42.057 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 
13 -le 1 ']' 00:42:42.057 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:42:42.057 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:42:42.319 ************************************ 00:42:42.319 START TEST bdev_json_nonarray 00:42:42.319 ************************************ 00:42:42.319 17:36:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:42:42.319 [2024-07-23 17:36:37.547397] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:42:42.319 [2024-07-23 17:36:37.547459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151662 ] 00:42:42.319 [2024-07-23 17:36:37.689146] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:42.618 [2024-07-23 17:36:37.762591] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:42.618 [2024-07-23 17:36:37.762681] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:42:42.618 [2024-07-23 17:36:37.762706] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:42:42.618 [2024-07-23 17:36:37.762722] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:42:42.618 00:42:42.618 real 0m0.379s 00:42:42.618 user 0m0.201s 00:42:42.618 sys 0m0.172s 00:42:42.618 17:36:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:42:42.618 17:36:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:42.618 17:36:37 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:42:42.618 ************************************ 00:42:42.618 END TEST bdev_json_nonarray 00:42:42.618 ************************************ 00:42:42.618 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # true 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]] 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t 
]] 00:42:42.618 17:36:37 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:42:42.618 00:42:42.618 real 1m14.226s 00:42:42.618 user 2m42.196s 00:42:42.618 sys 0m10.811s 00:42:42.618 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:42:42.618 17:36:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:42:42.618 ************************************ 00:42:42.618 END TEST blockdev_crypto_qat 00:42:42.618 ************************************ 00:42:42.618 17:36:37 -- common/autotest_common.sh@1142 -- # return 0 00:42:42.618 17:36:37 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:42:42.618 17:36:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:42:42.618 17:36:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:42:42.618 17:36:37 -- common/autotest_common.sh@10 -- # set +x 00:42:42.618 ************************************ 00:42:42.618 START TEST chaining 00:42:42.618 ************************************ 00:42:42.618 17:36:38 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:42:42.877 * Looking for test storage... 
00:42:42.877 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@7 -- # uname -s 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:42:42.877 17:36:38 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:42:42.877 17:36:38 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:42:42.877 17:36:38 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:42:42.877 17:36:38 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:42.877 17:36:38 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:42.877 17:36:38 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:42.877 17:36:38 chaining -- paths/export.sh@5 -- # export PATH 00:42:42.877 17:36:38 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@47 -- # : 0 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:42:42.877 17:36:38 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:42:42.877 17:36:38 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:42:42.877 17:36:38 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:42:42.877 17:36:38 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:42:42.877 17:36:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@296 -- # e810=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@297 -- # x722=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@298 -- # mlx=() 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@336 -- # return 1 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:42:50.996 WARNING: No supported devices were found, fallback requested for tcp test 00:42:50.996 17:36:46 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:42:50.996 Cannot find device "nvmf_tgt_br" 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@155 -- # true 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:42:50.996 Cannot find device "nvmf_tgt_br2" 00:42:50.996 17:36:46 chaining -- nvmf/common.sh@156 -- # true 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:42:50.997 Cannot find device "nvmf_tgt_br" 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@158 -- # 
true 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:42:50.997 Cannot find device "nvmf_tgt_br2" 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@159 -- # true 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:42:50.997 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@162 -- # true 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:42:50.997 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@163 -- # true 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:42:50.997 17:36:46 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:42:51.256 17:36:46 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:42:51.516 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:42:51.516 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.106 ms 00:42:51.516 00:42:51.516 --- 10.0.0.2 ping statistics --- 00:42:51.516 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:42:51.516 rtt min/avg/max/mdev = 0.106/0.106/0.106/0.000 ms 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:42:51.516 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:42:51.516 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.074 ms 00:42:51.516 00:42:51.516 --- 10.0.0.3 ping statistics --- 00:42:51.516 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:42:51.516 rtt min/avg/max/mdev = 0.074/0.074/0.074/0.000 ms 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:42:51.516 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:42:51.516 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.041 ms 00:42:51.516 00:42:51.516 --- 10.0.0.1 ping statistics --- 00:42:51.516 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:42:51.516 rtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@433 -- # return 0 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:42:51.516 17:36:46 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@481 -- # nvmfpid=155426 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:42:51.516 17:36:46 chaining -- nvmf/common.sh@482 -- # waitforlisten 155426 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@829 -- # '[' -z 155426 ']' 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:51.516 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:42:51.516 17:36:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:51.516 [2024-07-23 17:36:46.860314] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:42:51.516 [2024-07-23 17:36:46.860371] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:42:51.776 [2024-07-23 17:36:46.986617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:51.776 [2024-07-23 17:36:47.051059] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:42:51.776 [2024-07-23 17:36:47.051117] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:42:51.776 [2024-07-23 17:36:47.051135] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:42:51.776 [2024-07-23 17:36:47.051153] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:42:51.776 [2024-07-23 17:36:47.051168] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:42:51.776 [2024-07-23 17:36:47.051204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:42:52.344 17:36:47 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:42:52.344 17:36:47 chaining -- common/autotest_common.sh@862 -- # return 0 00:42:52.344 17:36:47 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:42:52.344 17:36:47 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:42:52.344 17:36:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:52.604 17:36:47 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@69 -- # mktemp 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.G58pYM8opx 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@69 -- # mktemp 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.adPRBiBXbe 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:52.604 malloc0 00:42:52.604 true 00:42:52.604 true 00:42:52.604 [2024-07-23 17:36:47.856310] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:42:52.604 crypto0 00:42:52.604 [2024-07-23 17:36:47.864339] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:42:52.604 crypto1 00:42:52.604 [2024-07-23 17:36:47.872505] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:42:52.604 [2024-07-23 17:36:47.888766] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:52.604 17:36:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:52.604 17:36:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:42:52.604 17:36:48 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:52.604 17:36:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:52.604 17:36:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:52.604 17:36:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:52.863 17:36:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:52.863 17:36:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:52.863 17:36:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:52.863 17:36:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.G58pYM8opx bs=1K count=64 00:42:52.863 64+0 records in 00:42:52.863 64+0 records out 00:42:52.863 65536 bytes (66 kB, 64 KiB) copied, 0.0010723 s, 61.1 MB/s 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.G58pYM8opx --ob Nvme0n1 --bs 65536 --count 1 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@25 -- # local config 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:42:52.863 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:42:52.863 17:36:48 chaining -- bdev/chaining.sh@31 -- # config='{ 00:42:52.863 "subsystems": [ 00:42:52.863 { 00:42:52.863 "subsystem": "bdev", 00:42:52.863 "config": [ 00:42:52.863 { 00:42:52.863 "method": "bdev_nvme_attach_controller", 00:42:52.863 "params": { 00:42:52.863 "trtype": "tcp", 00:42:52.863 "adrfam": "IPv4", 00:42:52.863 "name": "Nvme0", 00:42:52.863 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:52.863 "traddr": "10.0.0.2", 00:42:52.863 "trsvcid": "4420" 00:42:52.863 } 00:42:52.863 }, 00:42:52.863 { 00:42:52.863 "method": "bdev_set_options", 00:42:52.864 "params": { 00:42:52.864 "bdev_auto_examine": false 00:42:52.864 } 00:42:52.864 } 00:42:52.864 ] 00:42:52.864 } 00:42:52.864 ] 00:42:52.864 }' 00:42:52.864 17:36:48 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.G58pYM8opx --ob Nvme0n1 --bs 65536 --count 1 00:42:52.864 17:36:48 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:42:52.864 "subsystems": [ 00:42:52.864 { 00:42:52.864 "subsystem": "bdev", 00:42:52.864 "config": [ 00:42:52.864 { 00:42:52.864 "method": "bdev_nvme_attach_controller", 00:42:52.864 "params": { 
00:42:52.864 "trtype": "tcp", 00:42:52.864 "adrfam": "IPv4", 00:42:52.864 "name": "Nvme0", 00:42:52.864 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:52.864 "traddr": "10.0.0.2", 00:42:52.864 "trsvcid": "4420" 00:42:52.864 } 00:42:52.864 }, 00:42:52.864 { 00:42:52.864 "method": "bdev_set_options", 00:42:52.864 "params": { 00:42:52.864 "bdev_auto_examine": false 00:42:52.864 } 00:42:52.864 } 00:42:52.864 ] 00:42:52.864 } 00:42:52.864 ] 00:42:52.864 }' 00:42:52.864 [2024-07-23 17:36:48.219712] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:42:52.864 [2024-07-23 17:36:48.219780] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155647 ] 00:42:53.123 [2024-07-23 17:36:48.361477] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:53.123 [2024-07-23 17:36:48.433394] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:53.641  Copying: 64/64 [kB] (average 20 MBps) 00:42:53.641 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:53.641 17:36:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.641 17:36:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.641 17:36:48 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:53.641 17:36:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.641 17:36:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.641 17:36:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:53.641 17:36:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:53.642 17:36:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:53.642 17:36:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.642 17:36:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.642 17:36:48 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:53.642 17:36:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:53.642 17:36:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.642 17:36:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.642 17:36:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@96 -- # update_stats 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.901 
17:36:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.901 17:36:49 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:53.901 17:36:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.adPRBiBXbe --ib Nvme0n1 --bs 65536 --count 1 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@25 -- # local config 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:42:53.901 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@31 -- # config='{ 00:42:53.901 "subsystems": [ 00:42:53.901 { 00:42:53.901 "subsystem": "bdev", 00:42:53.901 "config": [ 00:42:53.901 { 00:42:53.901 "method": "bdev_nvme_attach_controller", 00:42:53.901 
"params": { 00:42:53.901 "trtype": "tcp", 00:42:53.901 "adrfam": "IPv4", 00:42:53.901 "name": "Nvme0", 00:42:53.901 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:53.901 "traddr": "10.0.0.2", 00:42:53.901 "trsvcid": "4420" 00:42:53.901 } 00:42:53.901 }, 00:42:53.901 { 00:42:53.901 "method": "bdev_set_options", 00:42:53.901 "params": { 00:42:53.901 "bdev_auto_examine": false 00:42:53.901 } 00:42:53.901 } 00:42:53.901 ] 00:42:53.901 } 00:42:53.901 ] 00:42:53.901 }' 00:42:53.901 17:36:49 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:42:53.901 "subsystems": [ 00:42:53.901 { 00:42:53.901 "subsystem": "bdev", 00:42:53.901 "config": [ 00:42:53.901 { 00:42:53.901 "method": "bdev_nvme_attach_controller", 00:42:53.901 "params": { 00:42:53.901 "trtype": "tcp", 00:42:53.901 "adrfam": "IPv4", 00:42:53.901 "name": "Nvme0", 00:42:53.901 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:53.901 "traddr": "10.0.0.2", 00:42:53.901 "trsvcid": "4420" 00:42:53.901 } 00:42:53.901 }, 00:42:53.901 { 00:42:53.901 "method": "bdev_set_options", 00:42:53.901 "params": { 00:42:53.901 "bdev_auto_examine": false 00:42:53.901 } 00:42:53.901 } 00:42:53.901 ] 00:42:53.901 } 00:42:53.901 ] 00:42:53.901 }' 00:42:53.902 17:36:49 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.adPRBiBXbe --ib Nvme0n1 --bs 65536 --count 1 00:42:54.160 [2024-07-23 17:36:49.360301] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:42:54.160 [2024-07-23 17:36:49.360370] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155854 ] 00:42:54.160 [2024-07-23 17:36:49.493447] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:54.160 [2024-07-23 17:36:49.557757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:54.680  Copying: 64/64 [kB] (average 20 MBps) 00:42:54.680 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:54.680 17:36:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:54.680 17:36:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:54.680 17:36:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:54.680 17:36:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:54.680 17:36:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:54.680 17:36:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:54.680 17:36:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:54.680 17:36:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:54.680 17:36:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:54.680 17:36:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:54.939 17:36:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:54.939 17:36:50 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:42:54.939 17:36:50 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:42:54.939 17:36:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:54.940 17:36:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:54.940 17:36:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:54.940 17:36:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.G58pYM8opx /tmp/tmp.adPRBiBXbe 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@25 -- # local config 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:42:54.940 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@31 -- # config='{ 00:42:54.940 "subsystems": [ 00:42:54.940 { 00:42:54.940 "subsystem": "bdev", 00:42:54.940 "config": [ 00:42:54.940 { 00:42:54.940 "method": "bdev_nvme_attach_controller", 00:42:54.940 "params": { 00:42:54.940 "trtype": "tcp", 00:42:54.940 "adrfam": "IPv4", 00:42:54.940 "name": "Nvme0", 00:42:54.940 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:54.940 "traddr": "10.0.0.2", 00:42:54.940 "trsvcid": "4420" 00:42:54.940 } 00:42:54.940 }, 00:42:54.940 { 00:42:54.940 "method": "bdev_set_options", 00:42:54.940 "params": { 00:42:54.940 "bdev_auto_examine": false 00:42:54.940 } 00:42:54.940 } 00:42:54.940 ] 00:42:54.940 } 00:42:54.940 ] 00:42:54.940 }' 00:42:54.940 
17:36:50 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:42:54.940 17:36:50 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:42:54.940 "subsystems": [ 00:42:54.940 { 00:42:54.940 "subsystem": "bdev", 00:42:54.940 "config": [ 00:42:54.940 { 00:42:54.940 "method": "bdev_nvme_attach_controller", 00:42:54.940 "params": { 00:42:54.940 "trtype": "tcp", 00:42:54.940 "adrfam": "IPv4", 00:42:54.940 "name": "Nvme0", 00:42:54.940 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:54.940 "traddr": "10.0.0.2", 00:42:54.940 "trsvcid": "4420" 00:42:54.940 } 00:42:54.940 }, 00:42:54.940 { 00:42:54.940 "method": "bdev_set_options", 00:42:54.940 "params": { 00:42:54.940 "bdev_auto_examine": false 00:42:54.940 } 00:42:54.940 } 00:42:54.940 ] 00:42:54.940 } 00:42:54.940 ] 00:42:54.940 }' 00:42:54.940 [2024-07-23 17:36:50.306588] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
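Each repeated `get_stat` trace above follows one pattern: call `accel_get_stats` over RPC, then extract either the top-level `sequence_executed` counter or a per-opcode `executed` count with jq. A sketch of just the extraction step, run against a canned payload (the JSON below is a hypothetical sample shaped like the fields the test reads, not real RPC output):

```shell
#!/usr/bin/env bash
# Hypothetical accel_get_stats payload with the fields chaining.sh queries.
stats='{"sequence_executed": 14, "operations": [
  {"opcode": "encrypt", "executed": 2},
  {"opcode": "decrypt", "executed": 14},
  {"opcode": "copy",    "executed": 4}]}'

# Top-level counter, as at bdev/chaining.sh@41:
echo "$stats" | jq -r .sequence_executed

# Per-opcode counter, as at bdev/chaining.sh@44:
echo "$stats" | jq -r '.operations[] | select(.opcode == "decrypt").executed'
```

The test then compares the extracted value against the cached `stats[...]` entry plus the expected delta, e.g. `(( 14 == stats[decrypt_executed] + 2 ))` after a decrypt pass.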
00:42:54.940 [2024-07-23 17:36:50.306655] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155970 ] 00:42:55.199 [2024-07-23 17:36:50.445476] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:55.199 [2024-07-23 17:36:50.517496] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:55.717  Copying: 64/64 [kB] (average 31 MBps) 00:42:55.717 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@106 -- # update_stats 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:55.717 17:36:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:55.717 17:36:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:55.717 17:36:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:55.717 17:36:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:55.717 17:36:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:55.717 17:36:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:55.717 17:36:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:55.718 17:36:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:55.718 17:36:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:55.718 17:36:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:55.718 
17:36:51 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:55.718 17:36:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:55.718 17:36:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:55.718 17:36:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.G58pYM8opx --ob Nvme0n1 --bs 4096 --count 16 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@25 -- # local config 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:42:55.718 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:42:55.718 17:36:51 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:42:55.977 17:36:51 chaining -- bdev/chaining.sh@31 -- # config='{ 00:42:55.978 "subsystems": [ 00:42:55.978 { 00:42:55.978 "subsystem": "bdev", 00:42:55.978 "config": [ 00:42:55.978 { 00:42:55.978 "method": "bdev_nvme_attach_controller", 00:42:55.978 "params": { 00:42:55.978 "trtype": "tcp", 00:42:55.978 "adrfam": "IPv4", 00:42:55.978 "name": "Nvme0", 00:42:55.978 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:55.978 "traddr": "10.0.0.2", 00:42:55.978 "trsvcid": "4420" 00:42:55.978 } 00:42:55.978 }, 00:42:55.978 { 00:42:55.978 "method": "bdev_set_options", 00:42:55.978 "params": { 00:42:55.978 "bdev_auto_examine": false 00:42:55.978 } 00:42:55.978 } 00:42:55.978 ] 00:42:55.978 } 00:42:55.978 ] 00:42:55.978 }' 00:42:55.978 17:36:51 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.G58pYM8opx --ob Nvme0n1 --bs 4096 --count 16 00:42:55.978 17:36:51 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:42:55.978 "subsystems": [ 00:42:55.978 { 00:42:55.978 "subsystem": "bdev", 00:42:55.978 "config": [ 00:42:55.978 { 00:42:55.978 "method": "bdev_nvme_attach_controller", 00:42:55.978 "params": { 00:42:55.978 "trtype": "tcp", 00:42:55.978 "adrfam": "IPv4", 00:42:55.978 "name": "Nvme0", 00:42:55.978 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:55.978 "traddr": "10.0.0.2", 00:42:55.978 "trsvcid": "4420" 00:42:55.978 } 00:42:55.978 }, 00:42:55.978 { 00:42:55.978 "method": "bdev_set_options", 00:42:55.978 "params": { 00:42:55.978 "bdev_auto_examine": false 00:42:55.978 } 00:42:55.978 } 00:42:55.978 ] 00:42:55.978 } 00:42:55.978 ] 00:42:55.978 }' 00:42:55.978 [2024-07-23 17:36:51.225468] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:42:55.978 [2024-07-23 17:36:51.225540] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156086 ] 00:42:55.978 [2024-07-23 17:36:51.356216] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:56.237 [2024-07-23 17:36:51.409609] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:56.496  Copying: 64/64 [kB] (average 10 MBps) 00:42:56.496 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:56.496 17:36:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.496 17:36:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.496 17:36:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:56.496 17:36:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:56.496 17:36:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.496 17:36:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.496 17:36:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:56.756 17:36:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.756 17:36:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:56.756 17:36:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:56.756 17:36:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:56.756 17:36:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.756 17:36:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@114 -- # update_stats 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.756 17:36:52 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:56.756 17:36:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:56.756 17:36:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:57.015 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@117 -- # : 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.adPRBiBXbe --ib Nvme0n1 --bs 4096 --count 16 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@25 -- # local config 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:42:57.016 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@31 -- # config='{ 00:42:57.016 "subsystems": [ 00:42:57.016 { 00:42:57.016 "subsystem": "bdev", 00:42:57.016 "config": [ 00:42:57.016 { 00:42:57.016 "method": "bdev_nvme_attach_controller", 00:42:57.016 "params": { 00:42:57.016 "trtype": "tcp", 00:42:57.016 "adrfam": "IPv4", 00:42:57.016 "name": "Nvme0", 00:42:57.016 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:57.016 "traddr": "10.0.0.2", 00:42:57.016 "trsvcid": "4420" 00:42:57.016 } 00:42:57.016 }, 00:42:57.016 { 00:42:57.016 "method": "bdev_set_options", 00:42:57.016 "params": { 00:42:57.016 "bdev_auto_examine": false 00:42:57.016 } 00:42:57.016 } 00:42:57.016 ] 00:42:57.016 } 00:42:57.016 ] 00:42:57.016 }' 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.adPRBiBXbe --ib Nvme0n1 --bs 4096 --count 16 00:42:57.016 17:36:52 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:42:57.016 "subsystems": [ 00:42:57.016 { 00:42:57.016 "subsystem": "bdev", 00:42:57.016 "config": [ 00:42:57.016 { 00:42:57.016 "method": "bdev_nvme_attach_controller", 00:42:57.016 "params": { 00:42:57.016 "trtype": "tcp", 00:42:57.016 "adrfam": "IPv4", 00:42:57.016 "name": "Nvme0", 00:42:57.016 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:42:57.016 "traddr": "10.0.0.2", 00:42:57.016 "trsvcid": "4420" 00:42:57.016 } 00:42:57.016 }, 00:42:57.016 { 00:42:57.016 "method": "bdev_set_options", 00:42:57.016 "params": { 00:42:57.016 "bdev_auto_examine": false 00:42:57.016 } 00:42:57.016 } 00:42:57.016 ] 00:42:57.016 } 00:42:57.016 ] 00:42:57.016 }' 00:42:57.016 [2024-07-23 17:36:52.309227] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 
initialization... 00:42:57.016 [2024-07-23 17:36:52.309295] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156293 ] 00:42:57.275 [2024-07-23 17:36:52.444139] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:57.275 [2024-07-23 17:36:52.500216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:57.534  Copying: 64/64 [kB] (average 1361 kBps) 00:42:57.534 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:42:57.534 17:36:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:42:57.534 17:36:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:57.534 17:36:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:57.534 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:57.794 17:36:52 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:57.794 17:36:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:42:57.794 17:36:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:57.794 17:36:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:57.794 17:36:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:42:57.794 17:36:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:57.794 17:36:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:57.794 17:36:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:42:57.794 17:36:53 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:42:57.794 17:36:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:57.794 17:36:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:57.794 17:36:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.G58pYM8opx /tmp/tmp.adPRBiBXbe 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.G58pYM8opx /tmp/tmp.adPRBiBXbe 00:42:57.794 17:36:53 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:42:57.794 17:36:53 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:42:57.794 17:36:53 chaining -- nvmf/common.sh@117 -- # sync 00:42:57.794 17:36:53 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:42:57.794 17:36:53 chaining -- nvmf/common.sh@120 -- # set +e 00:42:57.794 17:36:53 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:42:57.794 17:36:53 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:42:57.794 rmmod nvme_tcp 00:42:57.794 rmmod nvme_fabrics 00:42:57.794 rmmod nvme_keyring 00:42:58.055 17:36:53 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:42:58.055 17:36:53 chaining -- nvmf/common.sh@124 -- # set -e 00:42:58.055 17:36:53 chaining -- nvmf/common.sh@125 -- # return 0 00:42:58.055 17:36:53 chaining -- nvmf/common.sh@489 -- # '[' -n 155426 ']' 00:42:58.055 17:36:53 chaining -- nvmf/common.sh@490 -- # killprocess 155426 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@948 -- # '[' -z 
155426 ']' 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@952 -- # kill -0 155426 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@953 -- # uname 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 155426 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 155426' 00:42:58.055 killing process with pid 155426 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@967 -- # kill 155426 00:42:58.055 17:36:53 chaining -- common/autotest_common.sh@972 -- # wait 155426 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:42:58.314 17:36:53 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:42:58.314 17:36:53 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:42:58.314 17:36:53 chaining -- bdev/chaining.sh@132 -- # bperfpid=156505 00:42:58.314 17:36:53 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 
--wait-for-rpc -z 00:42:58.314 17:36:53 chaining -- bdev/chaining.sh@134 -- # waitforlisten 156505 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@829 -- # '[' -z 156505 ']' 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:42:58.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:42:58.314 17:36:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:58.573 [2024-07-23 17:36:53.757383] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:42:58.573 [2024-07-23 17:36:53.757454] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156505 ] 00:42:58.573 [2024-07-23 17:36:53.899554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:42:58.573 [2024-07-23 17:36:53.969494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:42:59.510 17:36:54 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:42:59.510 17:36:54 chaining -- common/autotest_common.sh@862 -- # return 0 00:42:59.510 17:36:54 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:42:59.510 17:36:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:42:59.510 17:36:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:42:59.510 malloc0 00:42:59.510 true 00:42:59.510 true 00:42:59.510 [2024-07-23 17:36:54.773829] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
00:42:59.510 crypto0 00:42:59.510 [2024-07-23 17:36:54.781858] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:42:59.510 crypto1 00:42:59.510 17:36:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:42:59.510 17:36:54 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:42:59.510 Running I/O for 5 seconds... 00:43:04.781 00:43:04.781 Latency(us) 00:43:04.781 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:04.781 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:43:04.781 Verification LBA range: start 0x0 length 0x2000 00:43:04.781 crypto1 : 5.02 10561.83 41.26 0.00 0.00 24170.30 6382.64 16754.42 00:43:04.781 =================================================================================================================== 00:43:04.781 Total : 10561.83 41.26 0.00 0.00 24170.30 6382.64 16754.42 00:43:04.781 0 00:43:04.781 17:36:59 chaining -- bdev/chaining.sh@146 -- # killprocess 156505 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@948 -- # '[' -z 156505 ']' 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@952 -- # kill -0 156505 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@953 -- # uname 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 156505 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 156505' 00:43:04.781 killing process with pid 156505 00:43:04.781 17:36:59 chaining -- common/autotest_common.sh@967 -- # kill 156505 00:43:04.781 Received shutdown signal, test time was 
about 5.000000 seconds 00:43:04.781 00:43:04.781 Latency(us) 00:43:04.781 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:04.781 =================================================================================================================== 00:43:04.781 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:04.782 17:36:59 chaining -- common/autotest_common.sh@972 -- # wait 156505 00:43:04.782 17:37:00 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:43:04.782 17:37:00 chaining -- bdev/chaining.sh@152 -- # bperfpid=157319 00:43:04.782 17:37:00 chaining -- bdev/chaining.sh@154 -- # waitforlisten 157319 00:43:04.782 17:37:00 chaining -- common/autotest_common.sh@829 -- # '[' -z 157319 ']' 00:43:05.041 17:37:00 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:05.041 17:37:00 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:43:05.041 17:37:00 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:05.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:05.041 17:37:00 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:43:05.041 17:37:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:05.041 [2024-07-23 17:37:00.251408] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:43:05.041 [2024-07-23 17:37:00.251481] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157319 ] 00:43:05.041 [2024-07-23 17:37:00.374352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:05.041 [2024-07-23 17:37:00.429628] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:43:05.980 17:37:01 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:43:05.980 17:37:01 chaining -- common/autotest_common.sh@862 -- # return 0 00:43:05.980 17:37:01 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:43:05.980 17:37:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:43:05.980 17:37:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:05.980 malloc0 00:43:05.980 true 00:43:05.980 true 00:43:05.980 [2024-07-23 17:37:01.337947] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:43:05.980 [2024-07-23 17:37:01.337996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:43:05.980 [2024-07-23 17:37:01.338018] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2543ef0 00:43:05.980 [2024-07-23 17:37:01.338030] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:43:05.980 [2024-07-23 17:37:01.339099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:43:05.980 [2024-07-23 17:37:01.339127] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:43:05.980 pt0 00:43:05.980 [2024-07-23 17:37:01.345977] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:43:05.980 crypto0 00:43:05.980 [2024-07-23 17:37:01.353999] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:43:05.980 crypto1 00:43:05.980 17:37:01 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:43:05.980 17:37:01 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:43:06.276 Running I/O for 5 seconds... 00:43:11.572 00:43:11.572 Latency(us) 00:43:11.572 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:11.572 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:43:11.572 Verification LBA range: start 0x0 length 0x2000 00:43:11.572 crypto1 : 5.02 8912.20 34.81 0.00 0.00 28633.62 6553.60 17324.30 00:43:11.572 =================================================================================================================== 00:43:11.572 Total : 8912.20 34.81 0.00 0.00 28633.62 6553.60 17324.30 00:43:11.572 0 00:43:11.572 17:37:06 chaining -- bdev/chaining.sh@167 -- # killprocess 157319 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@948 -- # '[' -z 157319 ']' 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@952 -- # kill -0 157319 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@953 -- # uname 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 157319 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 157319' 00:43:11.572 killing process with pid 157319 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@967 -- # kill 157319 00:43:11.572 Received shutdown signal, test time was about 5.000000 seconds 00:43:11.572 00:43:11.572 Latency(us) 00:43:11.572 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:11.572 
=================================================================================================================== 00:43:11.572 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@972 -- # wait 157319 00:43:11.572 17:37:06 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:43:11.572 17:37:06 chaining -- bdev/chaining.sh@170 -- # killprocess 157319 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@948 -- # '[' -z 157319 ']' 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@952 -- # kill -0 157319 00:43:11.572 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (157319) - No such process 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 157319 is not found' 00:43:11.572 Process with pid 157319 is not found 00:43:11.572 17:37:06 chaining -- bdev/chaining.sh@171 -- # wait 157319 00:43:11.572 17:37:06 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:43:11.572 17:37:06 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:43:11.572 17:37:06 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@296 -- # e810=() 00:43:11.572 17:37:06 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@297 -- # x722=() 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@298 -- # mlx=() 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:43:11.573 17:37:06 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@336 -- # return 1 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:43:11.573 WARNING: No supported devices were found, fallback requested for tcp test 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:43:11.573 17:37:06 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:43:11.573 Cannot find device "nvmf_tgt_br" 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@155 -- # true 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:43:11.573 Cannot find device "nvmf_tgt_br2" 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@156 -- # true 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:43:11.573 Cannot find device "nvmf_tgt_br" 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@158 -- # true 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:43:11.573 Cannot find device "nvmf_tgt_br2" 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@159 -- # true 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:43:11.573 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@162 -- # true 00:43:11.573 17:37:06 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:43:11.573 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@163 -- # true 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:43:11.573 17:37:06 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:43:11.833 17:37:07 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:43:12.092 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:43:12.092 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.111 ms 00:43:12.092 00:43:12.092 --- 10.0.0.2 ping statistics --- 00:43:12.092 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:43:12.092 rtt min/avg/max/mdev = 0.111/0.111/0.111/0.000 ms 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:43:12.092 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:43:12.092 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.075 ms 00:43:12.092 00:43:12.092 --- 10.0.0.3 ping statistics --- 00:43:12.092 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:43:12.092 rtt min/avg/max/mdev = 0.075/0.075/0.075/0.000 ms 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:43:12.092 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:43:12.092 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.040 ms 00:43:12.092 00:43:12.092 --- 10.0.0.1 ping statistics --- 00:43:12.092 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:43:12.092 rtt min/avg/max/mdev = 0.040/0.040/0.040/0.000 ms 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@433 -- # return 0 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:43:12.092 17:37:07 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@481 -- # nvmfpid=158520 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:43:12.092 17:37:07 chaining -- nvmf/common.sh@482 -- # waitforlisten 158520 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@829 -- # '[' -z 158520 ']' 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:43:12.092 17:37:07 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:43:12.092 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:43:12.092 17:37:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:12.092 [2024-07-23 17:37:07.491053] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:43:12.092 [2024-07-23 17:37:07.491132] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:43:12.352 [2024-07-23 17:37:07.650522] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:12.352 [2024-07-23 17:37:07.703797] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:43:12.352 [2024-07-23 17:37:07.703849] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:43:12.352 [2024-07-23 17:37:07.703867] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:43:12.352 [2024-07-23 17:37:07.703884] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:43:12.352 [2024-07-23 17:37:07.703903] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:43:12.352 [2024-07-23 17:37:07.703936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@862 -- # return 0 00:43:13.288 17:37:08 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:13.288 17:37:08 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:43:13.288 17:37:08 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:13.288 malloc0 00:43:13.288 [2024-07-23 17:37:08.492427] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:43:13.288 [2024-07-23 17:37:08.508696] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:43:13.288 17:37:08 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:43:13.288 17:37:08 chaining -- bdev/chaining.sh@189 -- # bperfpid=158592 00:43:13.288 17:37:08 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:43:13.288 17:37:08 chaining -- bdev/chaining.sh@191 -- # waitforlisten 158592 /var/tmp/bperf.sock 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@829 -- # '[' -z 158592 ']' 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:43:13.288 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:43:13.288 17:37:08 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:13.288 [2024-07-23 17:37:08.583815] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 00:43:13.288 [2024-07-23 17:37:08.583881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158592 ] 00:43:13.547 [2024-07-23 17:37:08.725082] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:13.547 [2024-07-23 17:37:08.781369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:43:14.115 17:37:09 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:43:14.115 17:37:09 chaining -- common/autotest_common.sh@862 -- # return 0 00:43:14.115 17:37:09 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:43:14.115 17:37:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:43:14.682 [2024-07-23 17:37:09.803122] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:43:14.682 nvme0n1 00:43:14.682 true 00:43:14.682 crypto0 00:43:14.682 17:37:09 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:43:14.682 Running I/O for 5 seconds... 
00:43:19.948 00:43:19.948 Latency(us) 00:43:19.948 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:19.948 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:43:19.948 Verification LBA range: start 0x0 length 0x2000 00:43:19.948 crypto0 : 5.03 6819.00 26.64 0.00 0.00 37414.59 4473.54 27240.18 00:43:19.948 =================================================================================================================== 00:43:19.948 Total : 6819.00 26.64 0.00 0.00 37414.59 4473.54 27240.18 00:43:19.948 0 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@39 -- # opcode= 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:19.948 17:37:14 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@205 -- # sequence=68558 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:43:19.948 17:37:15 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:43:19.948 17:37:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@206 -- # encrypt=34279 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:43:20.514 17:37:15 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@207 -- # decrypt=34279 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:43:20.772 17:37:16 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:43:20.772 17:37:16 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:21.031 17:37:16 chaining -- bdev/chaining.sh@208 -- # crc32c=68558 00:43:21.031 17:37:16 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:43:21.031 17:37:16 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:43:21.031 17:37:16 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:43:21.031 17:37:16 chaining -- bdev/chaining.sh@214 -- # killprocess 158592 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@948 -- # '[' -z 158592 ']' 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@952 -- # kill -0 158592 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@953 -- # uname 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 158592 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 158592' 00:43:21.031 killing process with pid 158592 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@967 -- # kill 158592 00:43:21.031 Received shutdown signal, test time was about 5.000000 seconds 00:43:21.031 00:43:21.031 Latency(us) 00:43:21.031 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:21.031 
=================================================================================================================== 00:43:21.031 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:21.031 17:37:16 chaining -- common/autotest_common.sh@972 -- # wait 158592 00:43:21.290 17:37:16 chaining -- bdev/chaining.sh@219 -- # bperfpid=159612 00:43:21.290 17:37:16 chaining -- bdev/chaining.sh@221 -- # waitforlisten 159612 /var/tmp/bperf.sock 00:43:21.290 17:37:16 chaining -- common/autotest_common.sh@829 -- # '[' -z 159612 ']' 00:43:21.290 17:37:16 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:43:21.290 17:37:16 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:43:21.290 17:37:16 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:43:21.290 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:43:21.290 17:37:16 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:43:21.290 17:37:16 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:43:21.290 17:37:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:21.290 [2024-07-23 17:37:16.643311] Starting SPDK v24.09-pre git sha1 b8378f94e / DPDK 22.11.4 initialization... 
00:43:21.291 [2024-07-23 17:37:16.643381] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid159612 ] 00:43:21.549 [2024-07-23 17:37:16.775710] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:43:21.549 [2024-07-23 17:37:16.827836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:43:22.483 17:37:17 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:43:22.483 17:37:17 chaining -- common/autotest_common.sh@862 -- # return 0 00:43:22.483 17:37:17 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:43:22.483 17:37:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:43:22.742 [2024-07-23 17:37:17.983596] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:43:22.742 nvme0n1 00:43:22.742 true 00:43:22.742 crypto0 00:43:22.742 17:37:18 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:43:22.742 Running I/O for 5 seconds... 
00:43:28.018 00:43:28.018 Latency(us) 00:43:28.018 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:28.018 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:43:28.018 Verification LBA range: start 0x0 length 0x200 00:43:28.018 crypto0 : 5.01 1619.99 101.25 0.00 0.00 19370.26 1353.46 20857.54 00:43:28.018 =================================================================================================================== 00:43:28.018 Total : 1619.99 101.25 0.00 0.00 19370.26 1353.46 20857.54 00:43:28.018 0 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@233 -- # sequence=16224 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:43:28.018 17:37:23 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:43:28.018 17:37:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@234 -- # encrypt=8112 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:43:28.279 17:37:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@235 -- # decrypt=8112 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:43:28.539 17:37:23 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:43:28.539 17:37:23 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:43:29.142 17:37:24 chaining -- bdev/chaining.sh@236 -- # crc32c=16224 00:43:29.142 17:37:24 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:43:29.142 17:37:24 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:43:29.142 17:37:24 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:43:29.142 17:37:24 chaining -- bdev/chaining.sh@242 -- # killprocess 159612 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@948 -- # '[' -z 159612 ']' 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@952 -- # kill -0 159612 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@953 -- # uname 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 159612 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 159612' 00:43:29.142 killing process with pid 159612 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@967 -- # kill 159612 00:43:29.142 Received shutdown signal, test time was about 5.000000 seconds 00:43:29.142 00:43:29.142 Latency(us) 00:43:29.142 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:43:29.142 
=================================================================================================================== 00:43:29.142 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:43:29.142 17:37:24 chaining -- common/autotest_common.sh@972 -- # wait 159612 00:43:29.402 17:37:24 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@117 -- # sync 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@120 -- # set +e 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:43:29.402 rmmod nvme_tcp 00:43:29.402 rmmod nvme_fabrics 00:43:29.402 rmmod nvme_keyring 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@124 -- # set -e 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@125 -- # return 0 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@489 -- # '[' -n 158520 ']' 00:43:29.402 17:37:24 chaining -- nvmf/common.sh@490 -- # killprocess 158520 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@948 -- # '[' -z 158520 ']' 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@952 -- # kill -0 158520 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@953 -- # uname 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 158520 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 158520' 00:43:29.402 killing process with pid 158520 
00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@967 -- # kill 158520 00:43:29.402 17:37:24 chaining -- common/autotest_common.sh@972 -- # wait 158520 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:43:29.662 17:37:25 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:43:29.662 17:37:25 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:43:29.662 17:37:25 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:43:29.919 17:37:25 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:43:29.919 00:43:29.919 real 0m47.143s 00:43:29.919 user 1m0.121s 00:43:29.919 sys 0m14.273s 00:43:29.919 17:37:25 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:43:29.919 17:37:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:43:29.919 ************************************ 00:43:29.919 END TEST chaining 00:43:29.919 ************************************ 00:43:29.919 17:37:25 -- common/autotest_common.sh@1142 -- # return 0 00:43:29.919 17:37:25 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:43:29.919 17:37:25 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:43:29.919 17:37:25 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:43:29.919 17:37:25 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:43:29.919 17:37:25 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:43:29.919 17:37:25 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:43:29.919 17:37:25 -- common/autotest_common.sh@722 -- # xtrace_disable 00:43:29.919 17:37:25 -- 
common/autotest_common.sh@10 -- # set +x 00:43:29.919 17:37:25 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:43:29.919 17:37:25 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:43:29.919 17:37:25 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:43:29.919 17:37:25 -- common/autotest_common.sh@10 -- # set +x 00:43:35.189 INFO: APP EXITING 00:43:35.189 INFO: killing all VMs 00:43:35.189 INFO: killing vhost app 00:43:35.189 INFO: EXIT DONE 00:43:38.480 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:43:38.480 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:43:38.480 Waiting for block devices as requested 00:43:38.480 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:43:38.738 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:43:38.738 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:43:38.738 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:43:38.997 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:43:38.997 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:43:38.997 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:43:39.256 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:43:39.256 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:43:39.256 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:43:39.516 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:43:39.516 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:43:39.516 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:43:39.775 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:43:39.775 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:43:39.775 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:43:40.034 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:43:44.228 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:43:44.229 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:43:44.229 Cleaning 00:43:44.229 Removing: /var/run/dpdk/spdk0/config 00:43:44.229 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:43:44.229 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:43:44.229 Removing: /var/run/dpdk/spdk0/hugepage_info 00:43:44.229 Removing: /dev/shm/nvmf_trace.0 00:43:44.229 Removing: /dev/shm/spdk_tgt_trace.pid4067328 00:43:44.229 Removing: /var/run/dpdk/spdk0 00:43:44.229 Removing: /var/run/dpdk/spdk_pid102539 00:43:44.229 Removing: /var/run/dpdk/spdk_pid106955 00:43:44.229 Removing: /var/run/dpdk/spdk_pid108561 00:43:44.229 Removing: /var/run/dpdk/spdk_pid110157 00:43:44.229 Removing: /var/run/dpdk/spdk_pid111880 00:43:44.229 Removing: /var/run/dpdk/spdk_pid113009 00:43:44.229 Removing: /var/run/dpdk/spdk_pid119241 00:43:44.229 Removing: /var/run/dpdk/spdk_pid123992 00:43:44.229 Removing: /var/run/dpdk/spdk_pid124560 00:43:44.229 Removing: /var/run/dpdk/spdk_pid125064 00:43:44.229 Removing: /var/run/dpdk/spdk_pid127279 00:43:44.229 Removing: /var/run/dpdk/spdk_pid129125 00:43:44.229 Removing: /var/run/dpdk/spdk_pid130942 00:43:44.229 Removing: /var/run/dpdk/spdk_pid132012 00:43:44.488 Removing: /var/run/dpdk/spdk_pid133234 00:43:44.488 Removing: /var/run/dpdk/spdk_pid133775 00:43:44.488 Removing: /var/run/dpdk/spdk_pid133884 00:43:44.488 Removing: /var/run/dpdk/spdk_pid134027 00:43:44.488 Removing: /var/run/dpdk/spdk_pid134236 00:43:44.488 Removing: /var/run/dpdk/spdk_pid134420 00:43:44.488 Removing: /var/run/dpdk/spdk_pid135624 00:43:44.488 Removing: /var/run/dpdk/spdk_pid137157 00:43:44.488 Removing: /var/run/dpdk/spdk_pid138653 00:43:44.488 
Removing: /var/run/dpdk/spdk_pid139366 00:43:44.488 Removing: /var/run/dpdk/spdk_pid140092 00:43:44.488 Removing: /var/run/dpdk/spdk_pid140420 00:43:44.488 Removing: /var/run/dpdk/spdk_pid140473 00:43:44.488 Removing: /var/run/dpdk/spdk_pid140501 00:43:44.488 Removing: /var/run/dpdk/spdk_pid141435 00:43:44.488 Removing: /var/run/dpdk/spdk_pid141985 00:43:44.488 Removing: /var/run/dpdk/spdk_pid142516 00:43:44.488 Removing: /var/run/dpdk/spdk_pid144645 00:43:44.488 Removing: /var/run/dpdk/spdk_pid146528 00:43:44.488 Removing: /var/run/dpdk/spdk_pid148732 00:43:44.488 Removing: /var/run/dpdk/spdk_pid149798 00:43:44.488 Removing: /var/run/dpdk/spdk_pid151017 00:43:44.488 Removing: /var/run/dpdk/spdk_pid151559 00:43:44.488 Removing: /var/run/dpdk/spdk_pid151662 00:43:44.488 Removing: /var/run/dpdk/spdk_pid155647 00:43:44.488 Removing: /var/run/dpdk/spdk_pid155854 00:43:44.488 Removing: /var/run/dpdk/spdk_pid155970 00:43:44.488 Removing: /var/run/dpdk/spdk_pid156086 00:43:44.488 Removing: /var/run/dpdk/spdk_pid156293 00:43:44.488 Removing: /var/run/dpdk/spdk_pid156505 00:43:44.488 Removing: /var/run/dpdk/spdk_pid157319 00:43:44.488 Removing: /var/run/dpdk/spdk_pid158592 00:43:44.488 Removing: /var/run/dpdk/spdk_pid159612 00:43:44.488 Removing: /var/run/dpdk/spdk_pid20054 00:43:44.488 Removing: /var/run/dpdk/spdk_pid24427 00:43:44.488 Removing: /var/run/dpdk/spdk_pid25408 00:43:44.488 Removing: /var/run/dpdk/spdk_pid26555 00:43:44.488 Removing: /var/run/dpdk/spdk_pid29611 00:43:44.488 Removing: /var/run/dpdk/spdk_pid34943 00:43:44.488 Removing: /var/run/dpdk/spdk_pid37461 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4066483 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4067328 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4067859 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4068588 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4068782 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4069606 00:43:44.488 Removing: /var/run/dpdk/spdk_pid4069710 00:43:44.488 Removing: 
/var/run/dpdk/spdk_pid4069992
00:43:44.488 Removing: /var/run/dpdk/spdk_pid4072607
00:43:44.488 Removing: /var/run/dpdk/spdk_pid4074125
00:43:44.488 Removing: /var/run/dpdk/spdk_pid4074359
00:43:44.488 Removing: /var/run/dpdk/spdk_pid4074667
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4075010
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4075308
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4075516
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4075720
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4076014
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4076646
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4079440
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4079653
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4079909
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4080129
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4080163
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4080409
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4080687
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4080901
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4081288
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4081785
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4082050
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4082243
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4082443
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4082646
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4082858
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4083144
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4083392
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4083591
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4083784
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4083984
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4084181
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4084383
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4084600
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4084904
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4085134
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4085332
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4085530
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4085887
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4086100
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4086463
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4086712
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4087035
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4087386
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4087604
00:43:44.747 Removing: /var/run/dpdk/spdk_pid4087836
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4088093
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4088560
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4088930
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4089113
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4093161
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4094891
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4096494
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4097382
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4098458
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4098824
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4098846
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4098904
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4102665
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4103215
00:43:44.748 Removing: /var/run/dpdk/spdk_pid4104118
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4104474
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4110482
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4112212
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4113096
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4117339
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4119302
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4120294
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4124541
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4126969
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4128034
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4138519
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4140740
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4141729
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4151982
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4154192
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4155334
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4166437
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4169536
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4170507
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4182273
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4184710
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4185864
00:43:45.007 Removing: /var/run/dpdk/spdk_pid41962
00:43:45.007 Removing: /var/run/dpdk/spdk_pid4512
00:43:45.007 Removing: /var/run/dpdk/spdk_pid45504
00:43:45.007 Removing: /var/run/dpdk/spdk_pid51283
00:43:45.007 Removing: /var/run/dpdk/spdk_pid54082
00:43:45.007 Removing: /var/run/dpdk/spdk_pid60393
00:43:45.007 Removing: /var/run/dpdk/spdk_pid62812
00:43:45.007 Removing: /var/run/dpdk/spdk_pid69134
00:43:45.007 Removing: /var/run/dpdk/spdk_pid71738
00:43:45.007 Removing: /var/run/dpdk/spdk_pid7654
00:43:45.007 Removing: /var/run/dpdk/spdk_pid78570
00:43:45.007 Removing: /var/run/dpdk/spdk_pid81631
00:43:45.007 Removing: /var/run/dpdk/spdk_pid86353
00:43:45.007 Removing: /var/run/dpdk/spdk_pid8668
00:43:45.007 Removing: /var/run/dpdk/spdk_pid86708
00:43:45.007 Removing: /var/run/dpdk/spdk_pid87056
00:43:45.007 Removing: /var/run/dpdk/spdk_pid87418
00:43:45.007 Removing: /var/run/dpdk/spdk_pid87853
00:43:45.007 Removing: /var/run/dpdk/spdk_pid88542
00:43:45.007 Removing: /var/run/dpdk/spdk_pid89288
00:43:45.007 Removing: /var/run/dpdk/spdk_pid89562
00:43:45.007 Removing: /var/run/dpdk/spdk_pid91186
00:43:45.007 Removing: /var/run/dpdk/spdk_pid92928
00:43:45.007 Removing: /var/run/dpdk/spdk_pid94534
00:43:45.007 Removing: /var/run/dpdk/spdk_pid95782
00:43:45.007 Clean
00:43:45.266 17:37:40 -- common/autotest_common.sh@1451 -- # return 0
00:43:45.266 17:37:40 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:43:45.266 17:37:40 -- common/autotest_common.sh@728 -- # xtrace_disable
00:43:45.266 17:37:40 -- common/autotest_common.sh@10 -- # set +x
00:43:45.266
17:37:40 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:43:45.266 17:37:40 -- common/autotest_common.sh@728 -- # xtrace_disable
00:43:45.266 17:37:40 -- common/autotest_common.sh@10 -- # set +x
00:43:45.266 17:37:40 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:43:45.266 17:37:40 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:43:45.266 17:37:40 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:43:45.266 17:37:40 -- spdk/autotest.sh@391 -- # hash lcov
00:43:45.266 17:37:40 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:43:45.266 17:37:40 -- spdk/autotest.sh@393 -- # hostname
00:43:45.266 17:37:40 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:43:45.525 geninfo: WARNING: invalid characters removed from testname!
00:44:17.659 17:38:08 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:44:17.659 17:38:12 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:44:20.193 17:38:14 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:44:22.726 17:38:17 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:44:25.258 17:38:20 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:44:27.793 17:38:22 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:44:30.331 17:38:25 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:44:30.331 17:38:25 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:44:30.331 17:38:25 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:44:30.331 17:38:25 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:44:30.331 17:38:25 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:44:30.331 17:38:25 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:44:30.331 17:38:25 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:44:30.331 17:38:25 -- paths/export.sh@4 -- $
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:44:30.331 17:38:25 -- paths/export.sh@5 -- $ export PATH
00:44:30.331 17:38:25 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:44:30.331 17:38:25 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:44:30.331 17:38:25 -- common/autobuild_common.sh@447 -- $ date +%s
00:44:30.331 17:38:25 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721749105.XXXXXX
00:44:30.331 17:38:25 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721749105.MBKtt3
00:44:30.331 17:38:25 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:44:30.331 17:38:25 -- common/autobuild_common.sh@453 -- $ '[' -n v22.11.4 ']'
00:44:30.331 17:38:25 -- common/autobuild_common.sh@454 -- $ dirname /var/jenkins/workspace/crypto-phy-autotest/dpdk/build
00:44:30.331 17:38:25 -- common/autobuild_common.sh@454 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk'
00:44:30.331 17:38:25 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:44:30.331 17:38:25 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/dpdk --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:44:30.331 17:38:25 -- common/autobuild_common.sh@463 -- $ get_config_params
00:44:30.331 17:38:25 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:44:30.331 17:38:25 -- common/autotest_common.sh@10 -- $ set +x
00:44:30.331 17:38:25 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/crypto-phy-autotest/dpdk/build'
00:44:30.332 17:38:25 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:44:30.332 17:38:25 -- pm/common@17 -- $ local monitor
00:44:30.332 17:38:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:30.332 17:38:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:30.332 17:38:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:30.332 17:38:25 -- pm/common@21 -- $ date +%s
00:44:30.332 17:38:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:30.332 17:38:25 -- pm/common@21 -- $ date +%s
00:44:30.332 17:38:25 -- pm/common@25 -- $ sleep 1
00:44:30.332 17:38:25 -- pm/common@21 -- $ date +%s
00:44:30.332 17:38:25 -- pm/common@21 -- $ date +%s
00:44:30.332 17:38:25 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721749105
00:44:30.332 17:38:25 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721749105
00:44:30.332 17:38:25 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721749105
00:44:30.332 17:38:25 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721749105
00:44:30.332 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721749105_collect-vmstat.pm.log
00:44:30.332 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721749105_collect-cpu-load.pm.log
00:44:30.591 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721749105_collect-cpu-temp.pm.log
00:44:30.591 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721749105_collect-bmc-pm.bmc.pm.log
00:44:31.530 17:38:26 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:44:31.530 17:38:26 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:44:31.530 17:38:26 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:44:31.530 17:38:26 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:44:31.530 17:38:26 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:44:31.530 17:38:26 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:44:31.530 17:38:26 -- spdk/autopackage.sh@19 -- $ timing_finish
00:44:31.530 17:38:26 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:44:31.530 17:38:26 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:44:31.530 17:38:26 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname
seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:44:31.530 17:38:26 -- spdk/autopackage.sh@20 -- $ exit 0
00:44:31.530 17:38:26 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:44:31.530 17:38:26 -- pm/common@29 -- $ signal_monitor_resources TERM
00:44:31.530 17:38:26 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:44:31.530 17:38:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:31.530 17:38:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:44:31.530 17:38:26 -- pm/common@44 -- $ pid=170869
00:44:31.530 17:38:26 -- pm/common@50 -- $ kill -TERM 170869
00:44:31.530 17:38:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:31.530 17:38:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:44:31.530 17:38:26 -- pm/common@44 -- $ pid=170871
00:44:31.530 17:38:26 -- pm/common@50 -- $ kill -TERM 170871
00:44:31.530 17:38:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:31.530 17:38:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:44:31.530 17:38:26 -- pm/common@44 -- $ pid=170873
00:44:31.530 17:38:26 -- pm/common@50 -- $ kill -TERM 170873
00:44:31.530 17:38:26 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:44:31.530 17:38:26 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:44:31.530 17:38:26 -- pm/common@44 -- $ pid=170900
00:44:31.530 17:38:26 -- pm/common@50 -- $ sudo -E kill -TERM 170900
00:44:31.530 + [[ -n 3895186 ]]
00:44:31.530 + sudo kill 3895186
00:44:31.540 [Pipeline] }
00:44:31.560 [Pipeline] // stage
00:44:31.565 [Pipeline] }
00:44:31.583 [Pipeline] // timeout
00:44:31.589 [Pipeline] }
00:44:31.606 [Pipeline] // catchError
00:44:31.612 [Pipeline] }
00:44:31.631 [Pipeline] // wrap
00:44:31.638 [Pipeline] }
00:44:31.654 [Pipeline] // catchError
00:44:31.665 [Pipeline] stage
00:44:31.667 [Pipeline] { (Epilogue)
00:44:31.683 [Pipeline] catchError
00:44:31.685 [Pipeline] {
00:44:31.700 [Pipeline] echo
00:44:31.702 Cleanup processes
00:44:31.709 [Pipeline] sh
00:44:31.995 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:44:31.995 170976 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:44:31.995 171190 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:44:32.009 [Pipeline] sh
00:44:32.294 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:44:32.294 ++ grep -v 'sudo pgrep'
00:44:32.294 ++ awk '{print $1}'
00:44:32.294 + sudo kill -9 170976
00:44:32.307 [Pipeline] sh
00:44:32.594 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:44:47.527 [Pipeline] sh
00:44:47.811 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:44:48.070 Artifacts sizes are good
00:44:48.084 [Pipeline] archiveArtifacts
00:44:48.091 Archiving artifacts
00:44:48.245 [Pipeline] sh
00:44:48.526 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:44:48.541 [Pipeline] cleanWs
00:44:48.550 [WS-CLEANUP] Deleting project workspace...
00:44:48.550 [WS-CLEANUP] Deferred wipeout is used...
00:44:48.557 [WS-CLEANUP] done
00:44:48.558 [Pipeline] }
00:44:48.578 [Pipeline] // catchError
00:44:48.590 [Pipeline] sh
00:44:48.870 + logger -p user.info -t JENKINS-CI
00:44:48.879 [Pipeline] }
00:44:48.895 [Pipeline] // stage
00:44:48.901 [Pipeline] }
00:44:48.917 [Pipeline] // node
00:44:48.923 [Pipeline] End of Pipeline
00:44:48.958 Finished: SUCCESS